muHVT: Predicting Cells and Layers using predictLayerHVT to monitor entities over time

Zubin Dowlaty, Somya Shambhawi

2023-06-07

1 Abstract

The muHVT package is a collection of R functions to facilitate building topology preserving maps for rich multivariate data, tending towards big data with a large number of rows. The R functions for this typical workflow are organized below:

  1. Data Compression: Vector quantization (VQ) or hierarchical vector quantization (HVQ), using means or medians. This step compresses the rows of a long data frame according to a compression objective.

  2. Data Projection: Dimension projection of the compressed cells to 1D, 2D or 3D with Sammon's non-linear mapping algorithm. This step creates topology preserving map (also called embedding) coordinates in the desired output dimension.

  3. Tessellation: Create the cells required for object visualization using the Voronoi tessellation method; the package includes heatmap plots for hierarchical Voronoi tessellations (HVT). This step enables data insights, visualization, and interaction with the topology preserving map, and is useful for semi-supervised tasks.

  4. Prediction: Score new data sets and record their cell assignments using the map objects from the above steps, chaining a sequence of maps if required.

2 Data Understanding

In this vignette, we will use the Prices of Personal Computers dataset. This dataset contains 6261 observations and 6 features, recording the prices of 486 personal computers in the US from 1993 to 1995. The variables are price, speed, hd, ram, screen and ads.

Here, we load the training and the testing dataset.

set.seed(240)
# Load data from csv files
trainComputers <- read.csv("https://raw.githubusercontent.com/Mu-Sigma/muHVT/dev/vignettes/sample_dataset/trainComputers.csv")
testComputers <- read.csv("https://raw.githubusercontent.com/Mu-Sigma/muHVT/dev/vignettes/sample_dataset/testComputers.csv")
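
As a quick check, the dimensions of the two data frames should account for the 6261 observations; a small sketch (the exact test-set size is an assumption, holding only if the split covers the full dataset):

nrow(trainComputers)  # 5008 training rows
nrow(testComputers)   # expected 6261 - 5008 = 1253 rows if the split covers the full dataset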

Let's have a look at the scaled training dataset containing 5008 data points.

# Scale the training data, keeping the column names and the scaling
# attributes (per-column centers and standard deviations) for later use
trainComputers <- scale(trainComputers)
metric_list <- colnames(trainComputers)
scale_attr <- attributes(trainComputers)

trainComputers <- trainComputers %>% as.data.frame()
trainComputers1 <- round(trainComputers, 4)
Table(head(trainComputers1))
price speed hd ram screen ads
-1.2973 -1.1954 -1.3136 -0.7176 -0.615 -2.388
-0.7999 -0.7834 -1.2898 -1.1078 -0.615 -2.388
-1.1360 -1.1954 -0.8854 -0.7176 0.549 -2.388
-0.7091 -1.1954 -0.8854 0.0630 -0.615 -2.388
1.7210 -0.7834 -0.0767 1.6240 -0.615 -2.388
2.3932 0.9159 -0.0767 1.6240 -0.615 -2.388
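
Since scale() stores the per-column centers and standard deviations as attributes, the scale_attr object saved above can be used to map scaled values back to the original units. A minimal sketch:

# Invert the scaling using the attributes stored by scale()
centers <- scale_attr[["scaled:center"]]
sds     <- scale_attr[["scaled:scale"]]
orig_units <- sweep(sweep(as.matrix(trainComputers), 2, sds, "*"), 2, centers, "+")
head(round(orig_units, 2))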

Now let us check the structure of the training data and analyse its summary.

str(trainComputers)
#> 'data.frame':    5008 obs. of  6 variables:
#>  $ price : num  -1.297 -0.8 -1.136 -0.709 1.721 ...
#>  $ speed : num  -1.195 -0.783 -1.195 -1.195 -0.783 ...
#>  $ hd    : num  -1.3136 -1.2898 -0.8854 -0.8854 -0.0767 ...
#>  $ ram   : num  -0.718 -1.108 -0.718 0.063 1.624 ...
#>  $ screen: num  -0.615 -0.615 0.549 -0.615 -0.615 ...
#>  $ ads   : num  -2.39 -2.39 -2.39 -2.39 -2.39 ...
summary(trainComputers)
#>      price             speed               hd                ram          
#>  Min.   :-2.2216   Min.   :-1.1954   Min.   :-1.31361   Min.   :-1.10781  
#>  1st Qu.:-0.7520   1st Qu.:-0.7834   1st Qu.:-0.68562   1st Qu.:-0.71755  
#>  Median :-0.1276   Median : 0.0920   Median :-0.07666   Median : 0.06296  
#>  Mean   : 0.0000   Mean   : 0.0000   Mean   : 0.00000   Mean   : 0.00000  
#>  3rd Qu.: 0.6269   3rd Qu.: 0.9159   3rd Qu.: 0.44666   3rd Qu.: 0.06296  
#>  Max.   : 5.2569   Max.   : 2.6667   Max.   : 8.29651   Max.   : 4.74606  
#>      screen            ads         
#>  Min.   :-0.615   Min.   :-2.3880  
#>  1st Qu.:-0.615   1st Qu.:-0.4416  
#>  Median :-0.615   Median : 0.2444  
#>  Mean   : 0.000   Mean   : 0.0000  
#>  3rd Qu.: 0.549   3rd Qu.: 0.6273  
#>  Max.   : 2.877   Max.   : 1.5207

3 Map A : Base Compressed Map

This package can perform vector quantization using the following algorithms: k-means and k-medoids (selected via the quant_method parameter).

For more information on vector quantization, refer to the package documentation.

The HVT function constructs highly compressed hierarchical Voronoi tessellations. The raw data is first scaled, and this scaled data is supplied as input to the vector quantization algorithm. The algorithm compresses the dataset until a user-defined compression percentage is achieved, with the quantization error acting as a threshold: for a given compression percentage we obtain 'n' cells, and compression is measured as the share of those cells whose quantization error falls below the threshold.
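
To make the threshold concrete, here is a toy illustration (not muHVT code) of how a single cell's quantization error would be computed with the L1 distance and the max error metric used later in this vignette:

# Toy cell: four 2-D points and their centroid
pts <- matrix(c(1.0, 2.0,
                1.2, 2.1,
                0.9, 1.8,
                1.1, 2.2), ncol = 2, byrow = TRUE)
centroid <- colMeans(pts)
l1_dist  <- rowSums(abs(sweep(pts, 2, centroid)))  # L1 distance of each point to the centroid
max(l1_dist)  # 'max' error metric: the cell meets the objective if this is below quant.err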

Let’s try to comprehend the HVT function first before moving ahead.

HVT(
  dataset,
  min_compression_perc,
  n_cells,
  depth,
  quant.err,
  distance_metric = c("L1_Norm", "L2_Norm"),
  error_metric = c("mean", "max"),
  quant_method = c("kmeans", "kmedoids"),
  normalize = TRUE,
  diagnose = FALSE,
  hvt_validation = FALSE,
  train_validation_split_ratio = 0.8
)

Each of the parameters is explained briefly below (see the package manual for full details):

  dataset : data frame containing the numeric features to compress
  min_compression_perc : minimum compression percentage to achieve, as an alternative to fixing the number of cells
  n_cells : number of cells to form at each level
  depth : number of levels in the hierarchy (depth = 1 builds a single level)
  quant.err : quantization error threshold for a cell
  distance_metric : distance between points and centroids (L1_Norm or L2_Norm)
  error_metric : how the point-to-centroid distances within a cell are summarized (mean or max)
  quant_method : clustering algorithm used for the quantization (kmeans or kmedoids)
  normalize : whether to scale the data inside the function (FALSE here, since we scaled the data beforehand)
  diagnose, hvt_validation, train_validation_split_ratio : optional diagnostics and validation settings

We will use the HVT function to compress our data while preserving essential features of the dataset. Our goal is to achieve a data compression of at least 80%. In situations where the compression ratio does not meet the desired target, we can adjust the model parameters, for example by modifying the quantization error threshold or increasing the number of cells, and then rerun the HVT function.

In our example we will iteratively increase the number of cells until the desired compression percentage is reached, rather than raising the quantization error threshold, because a higher threshold may reduce the level of detail captured in the data representation; a sketch of this iteration follows.
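
A minimal sketch of such an iteration, assuming the compression_summary structure shown later in this vignette (the starting value and step size are arbitrary):

n_cells_try <- 100
repeat {
  map_try <- muHVT::HVT(trainComputers,
                        n_cells = n_cells_try,
                        quant.err = 0.2,
                        depth = 1,
                        distance_metric = "L1_Norm",
                        error_metric = "max",
                        quant_method = "kmeans",
                        normalize = FALSE)
  pct <- map_try[[3]]$compression_summary$percentOfCellsBelowQuantizationErrorThreshold[1]
  if (pct >= 0.80) break           # desired compression reached
  n_cells_try <- n_cells_try + 50  # otherwise add cells and rebuild the map
}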

First, we will construct map A using the model parameters mentioned below.

Model Parameters

map_A <- list()
map_A <- muHVT::HVT(trainComputers,
                n_cells = 450,
                quant.err = 0.2,
                depth = 1,
                distance_metric = "L1_Norm",
                error_metric = "max",
                quant_method = "kmeans",
                normalize = F
)

As per the manual, map_A[[3]] gives us detailed information about the hierarchical vector quantized data, and map_A[[3]][['summary']] provides a tabular summary containing the number of points per cell, the quantization error and the codebook.

The datatable displayed below is the summary from map A.

summaryTable(map_A[[3]]$summary,scroll = T,limit = 500)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error price speed hd ram screen ads
1 1 1 9 316 0.25 0.72 0.92 -0.32 -0.02 0.55 -1.15
1 1 2 9 147 0.13 -1.19 -0.78 0.06 -0.72 0.55 -0.49
1 1 3 8 67 0.05 -1.06 -1.20 -0.70 -0.72 -0.61 0.94
1 1 4 6 304 0.16 -0.47 0.92 0.24 0.06 0.55 -1.30
1 1 5 12 188 0.16 -0.93 0.92 -0.01 -0.72 -0.61 0.43
1 1 6 14 109 0.09 -1.50 -0.78 -0.08 -0.72 -0.61 -0.09
1 1 7 10 89 0.14 0.18 0.92 -1.05 -0.72 -0.61 -2.29
1 1 8 7 286 0.11 0.43 0.09 0.49 0.06 -0.61 -0.56
1 1 9 11 11 0.14 -1.49 -1.05 -1.18 -1.00 -0.61 1.52
1 1 10 10 390 0.24 0.45 0.96 0.72 1.62 0.55 -0.70
1 1 11 5 367 0.04 0.56 0.92 0.82 1.62 -0.61 -0.30
1 1 12 9 382 0.09 0.78 0.92 0.82 1.62 -0.61 1.29
1 1 13 8 78 0.17 -1.40 -0.78 0.24 -0.72 -0.61 -1.19
1 1 14 15 111 0.07 -0.61 -1.20 -0.68 -0.72 -0.61 0.14
1 1 15 6 307 0.21 -0.01 2.67 -0.08 -0.20 -0.61 0.92
1 1 16 9 63 0.06 -1.27 -0.78 -0.97 -0.72 -0.61 0.93
1 1 17 9 349 0.11 0.17 2.67 0.26 0.06 -0.61 1.52
1 1 18 7 322 0.27 -0.11 -0.84 0.46 -0.27 2.88 0.13
1 1 19 10 432 0.06 1.35 -0.78 3.06 3.19 -0.61 -0.11
1 1 20 15 424 0.32 1.22 2.67 0.78 1.62 0.55 1.11
1 1 21 13 248 0.12 -0.03 -0.82 0.11 0.06 0.55 0.15
1 1 22 5 331 0.14 -0.45 0.92 -0.20 -0.72 2.88 -0.64
1 1 23 10 151 0.05 0.00 -0.78 -0.68 -0.72 -0.61 0.14
1 1 24 13 374 0.06 1.28 0.09 0.84 1.62 -0.61 0.76
1 1 25 5 380 0.24 2.20 0.92 0.51 -0.25 0.55 -0.80
1 1 26 11 50 0.08 -1.65 -1.20 -0.66 -0.72 -0.61 0.32
1 1 27 14 57 0.13 -1.08 -1.20 -0.82 -0.72 0.55 0.73
1 1 28 14 345 0.26 1.21 0.92 0.58 0.01 0.55 0.06
1 1 29 19 394 0.08 1.44 0.92 0.83 1.62 -0.61 0.72
1 1 30 13 123 0.08 -0.45 0.09 -1.19 -1.11 -0.61 0.10
1 1 31 20 396 0.15 0.28 -0.78 1.73 1.62 0.55 -0.64
1 1 32 6 90 0.18 -0.68 -0.85 -0.66 -0.72 0.55 1.52
1 1 33 11 74 0.08 -0.62 -0.82 -1.01 -0.72 -0.61 -1.08
1 1 34 6 372 0.13 1.76 0.92 0.82 0.06 0.55 0.84
1 1 35 15 337 0.07 0.66 -0.78 0.82 1.62 -0.61 0.31
1 1 36 12 220 0.13 -0.80 -0.78 0.19 0.06 0.55 0.51
1 1 37 11 305 0.19 1.18 0.92 -0.46 -0.72 0.55 0.32
1 1 38 6 71 0.06 -1.65 -0.78 -0.69 -0.72 -0.61 0.25
1 1 39 18 436 0.19 1.33 -0.78 3.06 3.19 0.55 -0.38
1 1 40 24 267 0.08 -0.34 0.92 0.32 0.06 -0.61 -0.16
1 1 41 9 293 0.08 0.10 0.92 0.39 0.06 -0.61 0.99
1 1 42 19 49 0.06 -0.99 -0.78 -1.19 -1.11 -0.61 0.75
1 1 43 6 273 0.13 0.42 -0.78 0.77 0.06 -0.61 -1.00
1 1 44 3 262 0.05 1.21 -0.78 -0.08 0.06 -0.61 0.50
1 1 45 9 371 0.18 2.17 0.92 0.77 -0.72 0.55 0.38
1 1 46 5 117 0.05 -0.37 -0.78 -1.08 -0.72 -0.61 0.60
1 1 47 5 52 0.02 -1.55 -0.78 -1.08 -0.72 -0.61 0.06
1 1 48 9 234 0.11 -0.34 -0.97 0.83 0.06 -0.61 0.46
1 1 49 5 413 0.18 1.12 0.92 -0.42 0.06 2.88 -1.19
1 1 50 27 213 0.12 0.28 0.09 -0.59 0.06 -0.61 -2.29
1 1 51 5 421 0.2 0.29 0.92 2.04 0.06 2.88 -0.71
1 1 52 22 419 0.42 1.00 2.67 1.09 1.62 -0.40 -0.09
1 1 53 16 407 0.36 1.34 0.56 -0.25 1.62 -0.18 -2.26
1 1 54 4 193 0.1 -1.14 0.09 -0.08 0.06 -0.61 -0.42
1 1 55 11 134 0.07 -0.42 -0.78 -0.59 -0.72 -0.61 -0.41
1 1 56 8 86 0.05 -0.92 -0.78 -1.12 -0.72 -0.61 0.45
1 1 57 13 142 0.11 -1.00 -0.78 0.23 -0.72 -0.61 -0.34
1 1 58 19 84 0.08 -1.04 -1.20 -0.69 -0.72 -0.61 0.35
1 1 59 7 12 0.1 -1.46 -1.20 -1.19 -1.11 0.55 0.63
1 1 60 10 437 0.08 1.53 0.92 3.06 3.19 -0.61 -0.11
1 1 61 16 388 0.1 1.40 0.92 0.71 1.62 -0.61 0.13
1 1 62 4 338 0.16 1.85 -0.13 0.45 0.06 0.55 0.26
1 1 63 10 239 0.16 -0.54 0.92 -0.04 -0.72 0.55 0.50
1 1 64 9 224 0.04 -0.15 -0.78 0.33 0.06 -0.61 0.73
1 1 65 21 384 0.17 1.26 0.92 -0.07 1.62 0.55 0.40
1 1 66 11 393 0.21 0.23 0.92 0.67 0.06 2.88 -0.12
1 1 67 8 333 0.18 -0.29 0.92 1.78 0.06 0.40 0.07
1 1 68 22 199 0.2 0.24 -0.78 -0.64 -0.04 0.55 -2.31
1 1 69 9 163 0.12 -0.38 -0.78 -0.57 0.06 -0.61 -1.41
1 1 70 9 246 0.14 -0.65 0.92 0.00 0.06 -0.61 0.15
1 1 71 11 48 0.06 -1.14 -1.20 -1.13 -0.72 -0.61 0.72
1 1 72 5 326 0.19 0.91 0.59 0.84 0.06 -0.61 -1.08
1 1 73 8 214 0.1 0.25 -0.89 -0.09 0.06 -0.61 -1.67
1 1 74 18 181 0.14 0.17 0.09 -0.67 -0.72 -0.61 -0.02
1 1 75 10 2 0.07 -1.14 -0.78 -1.29 -1.11 -0.61 -2.28
1 1 76 15 21 0.15 -1.31 -1.06 -1.12 -0.87 -0.61 -1.08
1 1 77 3 450 0.04 5.26 0.92 4.01 4.75 2.88 0.46
1 1 78 15 83 0.06 -1.19 -0.78 -0.67 -0.72 -0.61 0.90
1 1 79 9 105 0.11 -1.23 0.09 -0.88 -0.72 -0.61 0.81
1 1 80 11 297 0.13 0.88 0.92 -0.15 0.06 -0.61 0.10
1 1 81 5 106 0.02 -0.79 -0.78 -0.89 -0.72 -0.61 0.07
1 1 82 23 420 0.39 2.33 0.92 0.30 -0.04 2.88 0.60
1 1 83 21 320 0.23 0.16 0.92 0.87 0.03 0.55 0.38
1 1 84 14 54 0.06 -1.02 -0.78 -1.19 -1.11 -0.61 0.14
1 1 85 13 242 0.12 0.08 -0.88 0.41 0.06 -0.61 -0.43
1 1 86 9 373 0.25 1.38 0.92 0.03 0.06 0.55 -2.18
1 1 87 17 260 0.09 -0.62 0.92 0.31 0.06 -0.61 -0.70
1 1 88 12 282 0.3 -0.16 2.67 -0.10 -0.52 -0.61 -0.31
1 1 89 13 292 0.09 0.15 0.09 0.33 0.06 0.55 0.38
1 1 90 12 173 0.21 -0.82 0.09 -0.43 -0.75 0.55 -0.56
1 1 91 3 441 0.07 1.52 0.92 3.06 3.19 0.55 0.08
1 1 92 14 228 0.1 -0.32 -0.78 0.44 0.06 -0.61 1.52
1 1 93 22 124 0.08 -0.48 -0.78 -0.67 -0.72 -0.61 0.86
1 1 94 18 325 0.15 0.48 0.94 0.49 0.06 0.55 0.85
1 1 95 12 135 0.16 -1.06 0.09 -0.49 -0.72 -0.61 -0.66
1 1 96 12 87 0.06 -1.35 -0.78 -0.68 -0.72 -0.61 0.11
1 1 97 7 55 0.21 -0.74 0.45 -1.00 -0.94 0.55 -1.31
1 1 98 7 259 0.1 -0.55 -0.78 0.82 0.06 0.55 -0.88
1 1 99 5 204 0.09 -0.21 -0.78 -0.58 0.06 0.55 -0.99
1 1 100 14 212 0.16 -0.79 0.92 0.34 -0.72 -0.61 -0.53
1 1 101 11 88 0.08 -1.52 -0.78 -0.08 -0.72 -0.61 -0.68
1 1 102 12 196 0.16 0.71 -0.78 -0.16 -0.72 -0.61 0.50
1 1 103 8 269 0.22 -0.26 2.67 -0.73 -0.82 0.55 1.02
1 1 104 11 180 0.19 -0.57 0.92 -1.00 -0.97 0.55 0.15
1 1 105 9 354 0.23 -0.23 0.92 1.91 0.06 0.55 -0.95
1 1 106 20 402 0.26 0.86 0.92 0.71 1.62 0.55 1.24
1 1 107 14 122 0.08 -0.23 -0.78 -1.12 -0.72 -0.61 0.06
1 1 108 12 80 0.06 -1.49 -0.78 -0.64 -0.72 -0.61 0.49
1 1 109 9 97 0.06 -1.12 -0.78 -0.68 -0.72 -0.61 -0.01
1 1 110 14 287 0.11 -0.51 0.92 0.83 0.06 -0.61 -0.70
1 1 111 10 247 0.06 0.38 -0.78 0.33 0.06 -0.61 0.70
1 1 112 7 104 0.06 -0.34 -1.20 -1.02 -0.72 -0.61 0.48
1 1 113 10 218 0.09 -0.36 -0.78 0.43 0.06 -0.61 0.97
1 1 114 8 207 0.07 0.26 0.92 -0.68 -0.72 -0.61 0.64
1 1 115 7 348 0.1 1.55 0.92 0.83 0.06 -0.61 1.07
1 1 116 7 191 0.12 0.75 -0.78 -0.28 -0.72 -0.61 -0.06
1 1 117 16 69 0.14 -0.86 -0.83 -0.72 -0.72 -0.61 1.52
1 1 118 18 103 0.05 -0.88 -0.78 -0.66 -0.72 -0.61 0.90
1 1 119 12 24 0.12 -1.50 -1.16 -1.15 -0.78 -0.61 -0.46
1 1 120 6 34 0.12 0.41 -0.78 -0.77 -0.72 -0.61 -2.33
1 1 121 28 167 0.17 0.19 -0.89 -0.57 0.06 -0.61 -2.30
1 1 122 5 219 0.08 -0.85 -0.78 0.24 0.06 0.55 -0.33
1 1 123 6 288 0.16 0.22 -0.78 0.89 -0.07 0.55 0.48
1 1 124 24 395 0.13 1.20 0.92 0.45 1.62 0.55 0.43
1 1 125 14 369 0.14 1.26 0.09 0.69 1.62 -0.61 -0.08
1 1 126 8 23 0.06 -1.52 -1.20 -1.19 -1.11 -0.61 0.33
1 1 127 14 4 0.23 -1.28 -1.20 -0.96 -0.66 -0.61 -2.29
1 1 128 21 347 0.12 1.09 -0.80 0.68 1.62 -0.61 0.11
1 1 129 17 22 0.07 -0.39 -0.78 -1.01 -0.72 -0.61 -2.28
1 1 130 3 257 0.02 -0.45 -0.78 0.82 0.06 0.55 0.07
1 1 131 6 9 0.09 -0.91 0.09 -1.25 -1.11 -0.61 -2.06
1 1 132 20 277 0.11 0.46 0.92 -0.58 0.06 -0.61 -2.28
1 1 133 9 298 0.26 1.30 0.55 -0.70 -0.28 -0.61 -2.27
1 1 134 10 68 0.07 -1.55 -0.78 -0.63 -0.72 -0.61 0.87
1 1 135 9 208 0.06 -0.40 -1.20 0.33 0.06 -0.61 0.28
1 1 136 10 100 0.06 -0.67 -1.20 -0.65 -0.72 -0.61 -0.44
1 1 137 20 139 0.18 -0.65 -0.87 -0.62 -0.72 0.55 0.23
1 1 138 7 370 0.07 0.68 0.92 0.82 1.62 -0.61 0.44
1 1 139 12 95 0.05 -0.73 -0.78 -1.12 -0.72 -0.61 0.13
1 1 140 15 37 0.11 -1.42 -0.92 -0.69 -0.72 -0.61 1.52
1 1 141 8 263 0.17 0.50 0.09 -0.15 0.06 -0.61 1.03
1 1 142 11 295 0.11 0.73 0.09 -0.15 0.06 0.55 0.19
1 1 143 14 410 0.26 1.62 0.92 0.68 1.62 0.55 -0.25
1 1 144 6 3 0.22 -1.00 -0.92 -0.63 -0.91 2.88 -0.17
1 1 145 9 40 0.09 -0.56 1.38 -1.19 -1.11 -0.61 1.24
1 1 146 15 339 0.15 0.24 -1.03 0.82 1.62 -0.61 1.35
1 1 147 9 10 0.15 -0.74 -0.92 -0.76 -0.72 0.55 -2.35
1 1 148 20 187 0.16 -0.61 0.09 0.20 -0.72 -0.61 0.59
1 1 149 12 148 0.13 -1.07 0.92 -0.75 -0.72 -0.61 0.70
1 1 150 8 375 0.08 0.75 2.67 0.39 0.06 -0.61 1.52
1 1 151 14 389 0.21 2.47 0.92 0.72 0.06 0.55 0.32
1 1 152 18 344 0.09 0.62 -1.20 0.83 1.62 -0.61 0.74
1 1 153 7 309 0.1 0.30 0.92 0.82 0.06 -0.61 -0.69
1 1 154 17 357 0.32 0.84 -0.90 -0.13 1.62 -0.48 -2.27
1 1 155 10 332 0.18 0.12 0.96 0.75 0.06 0.55 1.52
1 1 156 16 129 0.17 -1.10 -0.81 -0.03 -0.72 -0.61 0.41
1 1 157 6 434 0.31 2.85 0.07 -0.06 1.62 2.88 -1.28
1 1 158 20 358 0.26 0.54 0.09 0.01 -0.05 2.88 0.53
1 1 159 13 179 0.28 0.42 -0.88 -0.67 -0.72 2.88 1.15
1 1 160 14 5 0.27 -0.61 -0.96 -0.68 -0.72 2.88 0.99
1 1 161 18 275 0.11 0.42 0.09 0.36 0.06 -0.61 0.73
1 1 162 12 8 0.08 -0.73 -1.20 -1.02 -0.72 -0.61 -2.33
1 1 163 14 310 0.12 1.00 0.92 0.15 0.06 -0.61 0.84
1 1 164 11 271 0.12 0.09 0.09 -0.16 0.06 0.55 0.68
1 1 165 5 250 0.11 -0.22 -0.95 0.33 0.06 0.55 -0.32
1 1 166 8 101 0.05 -0.78 -0.78 -0.89 -0.72 -0.61 0.78
1 1 167 9 426 0.31 2.13 2.67 1.77 0.06 -0.10 -0.57
1 1 168 16 156 0.11 -0.96 0.92 -0.82 -0.72 -0.61 0.08
1 1 169 4 98 0.07 -1.64 -0.78 -0.08 -0.72 -0.61 0.60
1 1 170 10 85 0.12 -1.26 -0.91 -0.71 -0.72 0.55 0.08
1 1 171 9 302 0.21 0.68 -0.01 -0.06 0.06 0.55 -0.58
1 1 172 10 351 0.1 0.96 -0.91 0.82 1.62 -0.61 -0.44
1 1 173 13 336 0.1 0.65 -1.13 0.62 1.62 -0.61 0.09
1 1 174 7 240 0.23 0.53 -0.78 -0.14 -0.49 0.55 0.81
1 1 175 6 329 0.14 1.40 0.92 0.58 0.06 -0.61 0.30
1 1 176 16 272 0.12 0.58 0.09 0.10 0.06 -0.61 0.16
1 1 177 10 429 0.06 1.03 -0.78 3.06 3.19 -0.61 -0.11
1 1 178 6 154 0.13 -1.33 -0.92 -0.08 0.06 -0.61 -0.92
1 1 179 14 366 0.13 0.40 2.67 0.51 0.06 0.55 0.44
1 1 180 7 15 0.17 -1.42 -0.96 -0.89 -0.83 0.55 1.52
1 1 181 10 153 0.15 -1.15 0.09 -0.21 -0.72 -0.61 0.41
1 1 182 2 236 0.06 -0.96 0.09 -0.08 0.06 0.55 0.08
1 1 183 14 131 0.23 -0.88 0.09 -0.91 -0.91 0.55 0.86
1 1 184 10 314 0.11 -0.47 0.92 0.86 0.06 0.55 -1.05
1 1 185 11 252 0.13 -0.16 0.09 0.30 0.06 -0.61 0.83
1 1 186 14 211 0.12 -0.43 -0.78 -0.27 0.06 0.55 0.78
1 1 187 6 185 0.16 0.21 0.78 -1.15 -0.72 -0.61 0.77
1 1 188 16 360 0.24 0.75 -0.86 0.62 1.62 0.55 -0.05
1 1 189 34 392 0.25 0.76 0.92 0.03 -0.01 2.88 0.51
1 1 190 11 400 0.28 1.26 0.77 0.08 1.62 0.55 -1.07
1 1 191 3 27 0.09 -1.97 -0.78 -0.50 -1.11 -0.61 -0.16
1 1 192 3 422 0.19 1.89 0.07 0.87 0.06 2.88 -1.08
1 1 193 20 186 0.14 0.01 0.09 -0.44 -0.72 -0.61 0.67
1 1 194 10 435 0.57 1.19 0.67 2.15 1.62 2.88 -0.66
1 1 195 4 401 0.04 1.09 1.38 0.82 1.62 -0.61 1.01
1 1 196 13 409 0.11 0.52 0.09 1.73 1.62 0.55 -0.93
1 1 197 9 385 0.26 0.83 -0.20 -0.45 -0.02 2.88 -1.23
1 1 198 4 308 0.09 2.01 -0.78 0.46 0.06 -0.61 0.52
1 1 199 5 166 0.24 -1.40 -0.95 0.10 0.06 -0.61 -0.08
1 1 200 15 294 0.2 0.12 0.92 -0.19 0.06 0.55 0.81
1 1 201 14 278 0.24 -0.52 0.92 0.60 0.01 -0.61 -1.30
1 1 202 9 81 0.07 -0.94 0.09 -1.19 -1.11 -0.61 0.41
1 1 203 2 379 0.07 0.22 1.15 -0.08 1.62 0.55 1.52
1 1 204 14 58 0.15 -0.41 -0.81 -0.73 0.06 -0.61 -2.32
1 1 205 7 449 0.71 3.86 1.92 1.68 0.29 2.88 -0.28
1 1 206 13 41 0.06 -1.48 -1.20 -1.14 -0.72 -0.61 0.50
1 1 207 7 73 0.06 -1.28 -0.78 -1.00 -0.72 -0.61 0.09
1 1 208 10 233 0.06 -0.40 -0.78 0.82 0.06 -0.61 0.07
1 1 209 5 19 0.1 -0.53 0.09 -1.14 -0.80 -0.61 -2.32
1 1 210 22 403 0.39 2.70 0.30 0.47 0.06 -0.61 -2.27
1 1 211 4 364 0.1 0.20 -0.99 0.82 1.62 0.55 1.27
1 1 212 4 324 0.08 1.11 0.92 0.12 0.06 -0.61 1.52
1 1 213 15 114 0.05 -0.88 -0.78 -0.65 -0.72 -0.61 0.10
1 1 214 13 155 0.1 -0.52 0.09 -0.73 -0.72 -0.61 0.26
1 1 215 7 172 0.1 -0.32 -0.84 -0.61 0.06 -0.61 0.87
1 1 216 8 416 0.41 0.96 -0.78 4.29 -0.03 -0.18 0.49
1 1 217 11 118 0.13 -0.91 0.92 -0.87 -0.86 -0.61 1.52
1 1 218 5 397 0.04 0.56 0.92 1.73 1.62 -0.61 0.07
1 1 219 11 318 0.13 0.94 0.92 -0.12 0.06 0.55 0.02
1 1 220 16 417 0.23 1.24 0.50 -0.55 -0.23 2.88 -2.30
1 1 221 9 39 0.06 -0.13 0.09 -1.01 -0.72 -0.61 -2.29
1 1 222 14 206 0.13 -0.50 -0.81 0.27 0.06 -0.61 -0.30
1 1 223 14 312 0.12 0.86 0.92 -0.36 0.06 0.55 0.57
1 1 224 11 136 0.08 -0.61 0.92 -1.19 -1.11 -0.61 0.27
1 1 225 4 29 0.06 -0.72 0.92 -1.19 -1.11 -0.61 -1.37
1 1 226 16 285 0.24 0.74 0.09 -0.23 -0.28 0.55 0.77
1 1 227 5 368 0.04 0.64 0.09 0.82 1.62 -0.61 1.52
1 1 228 7 376 0.17 0.69 -0.78 0.72 0.06 2.88 0.44
1 1 229 21 281 0.13 -0.25 0.92 0.59 0.06 -0.61 0.30
1 1 230 8 65 0.06 -1.29 -1.20 -0.66 -0.72 -0.61 0.70
1 1 231 10 431 0.06 1.19 -0.78 3.06 3.19 -0.61 0.47
1 1 232 5 411 0.04 1.15 1.38 0.82 1.62 -0.61 1.52
1 1 233 9 141 0.13 -0.90 0.92 -0.69 -0.72 -0.61 -1.12
1 1 234 10 353 0.06 0.27 -0.78 1.73 1.62 -0.61 0.07
1 1 235 13 443 0.16 1.49 0.09 3.06 3.19 0.55 -0.92
1 1 236 15 143 0.15 -0.41 -0.78 -0.72 -0.72 0.55 0.77
1 1 237 12 255 0.09 0.13 -0.78 0.85 0.06 -0.61 0.48
1 1 238 12 53 0.1 -0.51 -0.85 -0.78 -0.72 -0.61 -1.67
1 1 239 15 16 0.16 -1.19 -0.92 -1.04 -0.87 -0.61 -1.67
1 1 240 6 270 0.08 -0.05 0.09 0.41 0.06 -0.61 1.52
1 1 241 12 145 0.15 -0.03 -0.85 -0.65 -0.72 -0.61 0.60
1 1 242 12 35 0.06 -1.36 -0.78 -1.19 -1.11 -0.61 0.83
1 1 243 19 126 0.07 -0.55 -0.78 -0.65 -0.72 -0.61 0.10
1 1 244 11 327 0.16 0.68 0.92 -0.56 0.06 0.55 -2.10
1 1 245 8 363 0.16 0.73 2.67 0.52 0.06 -0.61 0.94
1 1 246 21 362 0.37 0.78 -0.90 0.36 1.62 -0.12 -1.36
1 1 247 12 42 0.13 -1.42 -0.82 -0.68 -0.72 -0.61 -1.28
1 1 248 4 14 0.09 -2.06 -0.78 -1.21 -1.11 -0.61 -0.19
1 1 249 19 202 0.08 0.14 0.92 -0.67 -0.72 -0.61 0.17
1 1 250 13 146 0.06 -0.38 0.92 -1.19 -1.11 -0.61 0.70
1 1 251 6 201 0.08 0.23 -0.78 -0.53 0.06 -0.61 0.25
1 1 252 8 33 0.05 -0.42 0.09 -0.89 -0.72 -0.61 -2.27
1 1 253 18 209 0.1 -0.48 -0.78 0.29 0.06 -0.61 0.43
1 1 254 12 323 0.06 0.19 -0.78 0.82 1.62 -0.61 0.44
1 1 255 8 222 0.17 0.91 0.09 -0.44 -0.72 -0.61 0.56
1 1 256 14 383 0.23 0.55 0.95 0.83 1.62 0.55 0.34
1 1 257 10 300 0.08 0.07 0.92 0.38 0.06 -0.61 1.52
1 1 258 5 51 0.07 -1.53 -0.78 -1.04 -0.72 -0.61 0.72
1 1 259 8 261 0.16 0.03 0.09 -0.35 0.06 0.55 0.10
1 1 260 14 138 0.06 -0.29 -0.78 -0.64 -0.72 -0.61 0.13
1 1 261 15 20 0.19 -0.78 -0.78 -0.05 -0.72 2.88 -0.12
1 1 262 16 342 0.09 0.68 -0.78 0.84 1.62 -0.61 0.82
1 1 263 7 158 0.08 0.08 -0.78 -0.58 -0.72 -0.61 -0.44
1 1 264 17 176 0.21 -0.46 0.09 -0.72 -0.79 0.55 0.31
1 1 265 13 447 0.59 1.10 -0.39 2.55 2.58 2.88 -1.02
1 1 266 10 200 0.12 -0.59 -1.20 0.43 0.06 -0.61 1.32
1 1 267 11 61 0.07 -1.02 0.09 -1.19 -1.11 -0.61 0.89
1 1 268 9 440 0.38 3.40 0.07 3.38 1.62 -0.61 0.74
1 1 269 10 404 0.06 0.35 -0.78 1.73 1.62 0.55 -1.30
1 1 270 7 195 0.15 -0.55 0.09 -0.51 0.06 -0.61 0.53
1 1 271 12 225 0.14 0.46 0.92 -0.58 -0.72 -0.61 -0.35
1 1 272 16 189 0.06 -0.08 0.92 -0.66 -0.72 -0.61 0.80
1 1 273 16 430 0.45 1.11 2.67 0.55 0.75 2.88 0.67
1 1 274 10 217 0.1 -0.05 -1.20 0.13 0.06 -0.61 0.10
1 1 275 2 164 0.07 -0.37 -0.78 0.34 -0.72 -0.61 -1.37
1 1 276 10 82 0.05 -0.62 -1.20 -1.12 -0.72 -0.61 0.17
1 1 277 8 47 0.15 -1.05 -0.83 -0.73 -0.72 0.55 -1.16
1 1 278 4 13 0.05 -2.09 -0.78 -1.21 -1.11 -0.61 0.76
1 1 279 19 159 0.19 0.12 -0.94 -0.80 -0.72 2.88 0.39
1 1 280 4 6 0.03 -0.76 0.92 -1.29 -1.11 -0.61 -2.25
1 1 281 14 160 0.14 -0.93 -0.78 -0.02 -0.72 0.55 0.48
1 1 282 11 170 0.29 -0.61 1.08 -0.96 -0.93 0.55 1.25
1 1 283 2 442 0.05 2.89 0.92 3.06 1.62 -0.61 -1.37
1 1 284 7 210 0.09 0.59 -0.78 -0.41 -0.72 0.55 -0.19
1 1 285 8 25 0.07 -1.55 -1.20 -1.16 -0.72 -0.61 0.90
1 1 286 15 279 0.18 0.42 0.92 -0.34 0.06 -0.61 -1.31
1 1 287 9 266 0.08 -0.33 0.09 0.82 0.06 -0.61 -0.74
1 1 288 6 79 0.14 -1.67 -0.78 -0.22 -0.85 0.55 0.07
1 1 289 6 276 0.22 -0.15 0.09 0.18 -0.20 0.55 1.52
1 1 290 11 28 0.1 -1.17 -0.78 -1.19 -1.11 0.55 0.66
1 1 291 15 102 0.1 -0.90 -0.78 -0.65 -0.72 -0.61 -0.63
1 1 292 5 161 0.08 -0.10 0.92 -0.85 -0.72 -0.61 -1.67
1 1 293 4 113 0.03 -0.29 -0.78 -1.14 -0.72 -0.61 0.87
1 1 294 17 341 0.15 0.38 2.67 0.54 0.06 -0.61 0.13
1 1 295 4 427 0.11 2.53 -0.78 -0.05 1.62 2.88 0.48
1 1 296 20 313 0.1 0.37 0.92 0.32 0.06 0.55 0.33
1 1 297 20 60 0.08 -1.45 -0.78 -0.67 -0.72 -0.61 -0.74
1 1 298 12 165 0.19 -0.22 0.09 -0.66 -0.65 -0.61 -0.93
1 1 299 13 245 0.09 0.28 -0.78 0.33 0.06 -0.61 0.13
1 1 300 12 157 0.17 -0.18 -0.92 -0.80 -0.72 0.55 0.42
1 1 301 2 444 0 2.90 0.92 0.32 4.75 0.55 0.50
1 1 302 5 229 0.16 -0.80 0.59 -0.08 0.06 -0.61 1.32
1 1 303 6 75 0.11 -0.72 0.09 -1.19 -1.11 -0.61 -0.65
1 1 304 7 119 0.11 0.29 0.09 -0.74 -0.72 -0.61 -2.32
1 1 305 13 26 0.06 -1.22 -1.20 -1.19 -1.11 -0.61 0.70
1 1 306 15 445 0.19 1.69 0.92 3.06 3.19 0.55 -0.91
1 1 307 22 289 0.27 0.51 0.09 -0.50 -0.19 0.55 -2.00
1 1 308 12 290 0.16 0.02 0.92 -0.27 0.06 0.55 0.14
1 1 309 11 340 0.17 1.29 0.92 0.18 0.06 0.55 0.95
1 1 310 6 184 0.13 -0.46 0.09 0.05 -0.72 -0.61 1.52
1 1 311 12 183 0.13 -0.22 1.26 -0.68 -0.72 -0.61 1.31
1 1 312 10 115 0.09 -0.46 -1.20 -0.71 -0.72 -0.61 0.63
1 1 313 10 256 0.15 0.42 0.09 -0.19 0.06 -0.61 -1.49
1 1 314 9 254 0.18 0.04 0.92 -0.41 0.06 -0.61 0.50
1 1 315 9 62 0.14 -0.97 0.09 -0.90 -0.89 -0.61 1.52
1 1 316 8 144 0.18 -0.16 -0.83 -0.61 -0.72 0.55 -1.33
1 1 317 4 418 0.13 2.77 0.92 0.48 0.06 0.55 -2.36
1 1 318 10 399 0.17 0.73 -0.78 -0.54 -0.09 2.88 -2.18
1 1 319 11 149 0.14 -1.20 0.92 -0.61 -0.72 -0.61 -0.60
1 1 320 12 36 0.07 -1.10 -1.20 -1.19 -1.11 -0.61 0.11
1 1 321 10 381 0.19 1.04 0.09 0.79 1.62 0.55 0.41
1 1 322 2 414 0.02 0.66 -0.78 3.06 1.62 -0.61 -0.30
1 1 323 4 241 0.08 -0.78 0.92 -0.08 0.06 -0.61 -0.46
1 1 324 8 66 0.15 -1.35 -0.78 -0.63 -0.77 0.55 -0.56
1 1 325 12 162 0.15 -1.18 -0.96 -0.08 0.06 -0.61 0.62
1 1 326 6 127 0.1 -0.66 -0.92 -0.65 -0.72 0.55 -0.45
1 1 327 15 77 0.07 -0.62 -0.78 -1.19 -1.11 -0.61 0.11
1 1 328 11 315 0.12 0.44 1.38 0.42 0.06 -0.61 1.29
1 1 329 12 346 0.31 2.27 0.64 0.55 -0.07 -0.61 0.36
1 1 330 13 44 0.07 -1.51 -1.20 -1.08 -0.72 -0.61 0.12
1 1 331 19 251 0.14 -0.18 -0.91 0.33 0.06 0.55 0.61
1 1 332 13 91 0.07 -0.83 -1.20 -0.71 -0.72 -0.61 0.70
1 1 333 8 230 0.06 -0.50 -0.78 0.82 0.06 -0.61 -1.30
1 1 334 17 226 0.07 -0.14 -0.78 0.33 0.06 -0.61 0.30
1 1 335 9 258 0.21 0.26 0.09 -0.57 -0.11 0.55 -1.08
1 1 336 18 356 0.35 -0.17 -0.11 3.06 0.06 -0.42 -0.81
1 1 337 3 321 0.07 0.84 0.92 0.83 0.06 -0.61 0.63
1 1 338 13 64 0.14 -1.52 -0.91 -0.65 -0.72 -0.61 -0.37
1 1 339 12 268 0.14 0.16 0.09 0.45 0.06 -0.61 0.35
1 1 340 12 365 0.23 0.41 2.67 0.44 0.00 0.55 -0.30
1 1 341 6 125 0.14 -0.90 -0.99 -0.85 0.06 -0.61 0.87
1 1 342 7 264 0.05 -0.35 -0.78 0.87 0.06 0.55 0.47
1 1 343 4 140 0.07 -1.53 0.09 -0.08 -0.72 -0.61 -0.19
1 1 344 15 175 0.14 0.01 -0.78 -0.08 -0.72 -0.61 0.76
1 1 345 17 56 0.12 -0.88 -0.86 -1.16 -0.95 -0.61 -0.44
1 1 346 18 110 0.13 -1.08 -0.78 -0.65 -0.72 0.55 0.68
1 1 347 12 205 0.1 -0.55 0.92 -0.60 -0.72 0.55 0.61
1 1 348 7 330 0.06 0.30 -1.20 0.82 1.62 -0.61 0.32
1 1 349 11 194 0.09 -0.47 -0.78 -0.53 0.06 0.55 0.11
1 1 350 10 38 0.06 -0.15 -0.78 -0.63 -0.72 -0.61 -2.33
1 1 351 9 108 0.11 -1.28 0.09 -0.90 -0.72 -0.61 -0.02
1 1 352 13 169 0.1 -0.12 0.92 -1.17 -1.02 -0.61 0.14
1 1 353 14 235 0.17 -0.23 0.98 0.25 -0.72 -0.61 0.59
1 1 354 18 296 0.12 -0.43 0.92 0.44 0.06 0.55 0.27
1 1 355 25 93 0.05 -1.20 -0.78 -0.68 -0.72 -0.61 0.48
1 1 356 10 306 0.3 -0.51 0.34 -0.16 -0.72 2.88 0.01
1 1 357 16 334 0.25 0.22 -0.78 -0.01 0.06 2.88 0.93
1 1 358 24 303 0.1 0.52 0.92 0.33 0.06 -0.61 0.55
1 1 359 5 265 0.05 -0.41 0.09 0.82 0.06 -0.61 -1.30
1 1 360 4 355 0.04 0.58 0.09 0.82 1.62 -0.61 1.01
1 1 361 14 243 0.16 -0.57 -0.84 0.32 0.06 0.55 1.45
1 1 362 8 291 0.07 0.02 0.09 0.34 0.06 0.55 0.93
1 1 363 9 17 0.06 -1.58 -1.20 -1.19 -1.11 -0.61 0.91
1 1 364 21 386 0.17 0.34 2.67 0.46 0.06 0.55 1.45
1 1 365 3 203 0.15 -0.99 0.64 -0.28 0.06 -0.61 -1.30
1 1 366 5 76 0.04 -0.96 -0.78 -1.14 -0.72 -0.61 0.89
1 1 367 23 352 0.11 1.09 -0.80 0.83 1.62 -0.61 0.68
1 1 368 10 168 0.11 -0.60 -0.82 0.36 -0.72 -0.61 0.51
1 1 369 21 408 0.14 1.65 0.92 0.68 1.62 0.55 0.66
1 1 370 2 198 0.1 1.06 -0.35 -0.55 -0.72 -0.61 -1.08
1 1 371 9 182 0.15 -0.12 0.92 -0.75 -0.72 -0.61 -1.05
1 1 372 16 96 0.11 -0.64 -0.83 -0.62 -0.72 -0.61 -1.15
1 1 373 24 132 0.23 -0.12 2.67 -0.87 -0.86 -0.61 1.13
1 1 374 9 253 0.15 0.48 0.92 -0.03 -0.72 -0.61 0.79
1 1 375 12 249 0.19 0.22 -0.85 -0.29 0.06 0.55 -1.47
1 1 376 13 299 0.16 -0.43 0.92 0.14 0.06 0.55 1.36
1 1 377 8 350 0.23 -0.16 0.61 -0.08 0.06 2.88 -0.14
1 1 378 18 18 0.08 -0.80 -0.78 -0.93 -0.72 -0.61 -2.30
1 1 379 19 216 0.07 -0.22 -1.20 0.33 0.06 -0.61 0.74
1 1 380 3 32 0.01 -0.46 -0.78 -0.50 -0.72 -0.61 -2.35
1 1 381 12 301 0.16 0.55 0.92 0.26 0.06 -0.61 -0.19
1 1 382 6 1 0.1 -1.47 -1.13 -1.29 -1.11 -0.61 -2.29
1 1 383 7 7 0.19 -1.24 -0.96 -1.20 -1.05 0.55 -1.33
1 1 384 9 197 0.19 -1.03 0.92 -0.33 -0.72 0.55 -0.56
1 1 385 6 311 0.11 0.48 0.92 0.84 0.06 -0.61 0.32
1 1 386 9 335 0.17 1.28 0.92 -0.50 -0.72 0.55 -2.24
1 1 387 17 439 0.1 1.20 -0.78 3.06 3.19 0.55 -1.11
1 1 388 7 283 0.14 -0.69 0.09 0.70 0.06 0.55 -0.87
1 1 389 20 359 0.31 0.64 -0.91 0.74 1.62 0.55 0.62
1 1 390 19 231 0.15 -0.52 -0.81 0.82 0.06 -0.61 -0.71
1 1 391 15 319 0.14 0.19 0.92 0.67 0.06 0.55 -0.41
1 1 392 14 59 0.07 -1.14 -1.20 -1.06 -0.72 -0.61 0.11
1 1 393 9 406 0.23 0.59 0.82 0.37 0.06 2.88 1.41
1 1 394 5 438 0.04 1.57 0.92 3.06 3.19 -0.61 0.47
1 1 395 6 405 0.17 2.16 0.92 0.73 1.62 -0.61 0.43
1 1 396 9 343 0.21 1.24 -0.78 -0.34 -0.72 2.88 -0.02
1 1 397 15 244 0.14 0.09 0.92 -0.58 -0.72 0.55 0.48
1 1 398 8 121 0.1 -1.10 -0.78 -0.03 -0.72 -0.61 0.98
1 1 399 15 150 0.08 -0.56 0.09 -0.71 -0.72 -0.61 0.83
1 1 400 7 46 0.05 -1.39 -1.20 -0.80 -0.72 -0.61 0.91
1 1 401 12 428 0.34 1.36 0.92 3.22 1.62 0.45 0.22
1 1 402 9 99 0.07 -0.83 0.92 -1.19 -1.11 -0.61 0.91
1 1 403 6 387 0.16 1.18 0.37 0.46 1.62 -0.61 -1.47
1 1 404 13 237 0.05 0.09 -0.78 0.33 0.06 -0.61 0.83
1 1 405 10 72 0.11 -0.69 -1.20 -1.14 -0.76 -0.61 0.61
1 1 406 7 43 0.1 -1.66 -1.20 -0.83 -0.72 -0.61 0.72
1 1 407 7 152 0.1 -0.80 -0.90 -0.66 0.06 -0.61 0.12
1 1 408 18 284 0.2 -0.66 0.92 0.21 -0.02 0.55 -0.48
1 1 409 11 45 0.05 -1.33 -0.78 -1.19 -1.11 -0.61 0.38
1 1 410 8 133 0.11 -0.16 0.09 -0.77 -0.72 -0.61 -1.67
1 1 411 10 70 0.06 -0.64 -0.78 -1.19 -1.11 -0.61 0.66
1 1 412 19 425 0.32 1.23 0.96 0.59 1.62 2.88 0.48
1 1 413 17 415 0.32 0.68 0.92 1.84 1.62 0.55 -0.97
1 1 414 12 171 0.08 -0.51 0.92 -0.73 -0.72 -0.61 0.90
1 1 415 7 92 0.05 -0.73 -0.78 -1.13 -0.72 -0.61 0.75
1 1 416 3 223 0.06 0.74 -0.78 -0.53 0.06 -0.61 0.30
1 1 417 12 328 0.11 0.23 -0.82 0.82 1.62 -0.61 -0.32
1 1 418 12 190 0.18 -0.85 0.09 -0.02 -0.72 0.55 0.48
1 1 419 20 128 0.05 -0.48 -0.78 -0.67 -0.72 -0.61 0.50
1 1 420 10 94 0.21 -1.18 -0.82 -0.08 -0.56 -0.61 1.52
1 1 421 6 412 0.38 0.64 1.21 3.24 0.06 -0.23 0.16
1 1 422 7 378 0.11 2.26 0.92 0.55 0.06 -0.61 -1.42
1 1 423 7 232 0.17 -0.06 0.92 -0.66 -0.72 0.55 -0.92
1 1 424 8 137 0.26 -1.51 -0.56 0.43 -0.72 0.55 -1.07
1 1 425 16 391 0.37 1.32 0.35 -0.36 -0.52 2.88 0.81
1 1 426 7 215 0.11 0.18 -0.90 -0.14 0.06 -0.61 -1.08
1 1 427 11 120 0.17 -0.44 0.09 -1.19 -1.07 -0.61 0.71
1 1 428 23 112 0.05 -0.85 -0.78 -0.69 -0.72 -0.61 0.43
1 1 429 6 107 0.1 -0.96 0.09 -0.71 -0.72 -0.61 -1.27
1 1 430 9 423 0.09 1.46 2.67 0.82 1.62 -0.61 1.29
1 1 431 11 31 0.16 -1.04 -0.90 -1.19 -1.11 0.55 -0.04
1 1 432 5 377 0.06 1.46 0.09 0.82 1.62 -0.61 0.35
1 1 433 9 274 0.13 -0.64 0.92 -0.03 0.06 0.55 0.33
1 1 434 7 446 0.8 2.12 0.92 5.82 0.40 -0.12 0.43
1 1 435 6 280 0.3 1.20 0.92 -0.31 -0.59 -0.61 -0.47
1 1 436 11 192 0.08 -0.44 -0.78 -0.08 0.06 -0.61 0.80
1 1 437 10 30 0.14 -0.26 0.92 -1.01 -0.76 -0.61 -2.30
1 1 438 7 398 0.15 1.43 0.92 0.71 1.62 -0.61 -0.62
1 1 439 5 130 0.1 0.16 -0.95 -1.14 -0.72 -0.61 0.55
1 1 440 11 221 0.14 -0.20 0.92 0.00 -0.72 -0.61 -0.13
1 1 441 11 227 0.07 0.27 -0.78 -0.08 0.06 -0.61 0.08
1 1 442 12 238 0.1 0.12 -0.82 0.30 0.06 -0.61 0.54
1 1 443 11 174 0.16 -0.30 0.92 -0.88 -0.86 -0.61 -0.51
1 1 444 15 178 0.11 -0.43 0.92 -0.72 -0.72 -0.61 0.24
1 1 445 8 177 0.1 -0.28 -0.78 -0.62 0.06 -0.61 -0.06
1 1 446 26 433 0.31 2.54 0.92 0.26 1.62 2.88 0.30
1 1 447 14 317 0.18 0.09 -0.78 -0.15 0.06 2.88 0.13
1 1 448 22 448 0.58 1.99 2.67 3.06 2.90 -0.30 0.21
1 1 449 14 116 0.06 -0.75 -0.78 -0.64 -0.72 -0.61 0.73
1 1 450 9 361 0.07 0.89 0.09 0.82 1.62 -0.61 0.46

Now let us understand what each column in the above table means:

  Segment.Level : level of the cell in the hierarchy (here always 1, since depth = 1)
  Segment.Parent : the parent segment of the cell
  Segment.Child : the sequential cell number within its parent, used to reference individual cells
  n : the number of data points captured by the cell
  Cell.ID : a unique identifier assigned to the cell
  Quant.Error : the quantization error of the cell

All the columns after these contain the centroid coordinates of each cell. Together, the centroids form a codebook, which represents the collection of all centroids or codewords.
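
For instance, the codebook alone can be extracted from the summary; a sketch using the feature columns shown in the table above:

# Pull out just the centroid coordinates -- the codebook -- one row per cell
codebook <- map_A[[3]]$summary[, c("price", "speed", "hd", "ram", "screen", "ads")]
dim(codebook)  # 450 cells x 6 features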

Now, let's check the compression summary for HVT (map A). The table below shows, for each level, the number of cells, the number of cells with quantization error below the threshold, and the percentage of cells with quantization error below the threshold.

mapA_compression_summary <- map_A[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapA_compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 450 371 0.82 n_cells: 450 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, 82% of the cells have a quantization error below the threshold. Since we have attained the desired compression percentage, we will not subdivide the cells further.

Now let's try to understand the plotHVT function. Its parameters are shown in the signature below:

plotHVT(hvt.results, line.width, color.vec, pch1 = 21, centroid.size = 3, title = NULL, maxDepth = 1)

Let's plot the Voronoi tessellation for layer 1 (map A).

muHVT::plotHVT(map_A,
        line.width = c(0.4), 
        color.vec = c("#141B41"),
        centroid.size = 0.01,
        maxDepth = 1) 
Figure 1: The Voronoi Tessellation for layer 1 (map A) shown for the 450 cells in the dataset 'computers'

We will now overlay each of the features as a heatmap over the Voronoi tessellation plot for better visualization and identification of patterns, trends, and variations in the data.

Let's have a look at the hvtHmap function, which we will use to overlay features as heatmaps.

hvtHmap(hvt.results, dataset, child.level, hmap.cols, color.vec ,line.width, palette.color = 6)

Now let's plot the Voronoi tessellation with the heatmap overlaid for each of the features in the computers data for better visualization and interpretation of data patterns and distributions.

The heatmaps displayed below provide a visual representation of the spatial characteristics of the computers data, allowing us to observe patterns and trends in the distribution of each of the features (price, speed, hd, ram, screen, ads). The greener shades highlight regions with higher values in each heatmap, while the indigo shades indicate areas with the lowest values. By analyzing these heatmaps, we can gain insights into the variations of, and relationships between, these features within the computers data.


  hvtHmap(
  map_A,
  scores,  # data to overlay; 'scores' is assumed to be defined in the full vignette (e.g., the scored training data)
  child.level = 1,
  hmap.cols = "price",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 450
) 
Figure 2: The Voronoi Tessellation with the heat map overlaid for variable 'price' in the 'computers' dataset


  hvtHmap(
  map_A,
  scores,
  child.level = 1,
  hmap.cols = "speed",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 450
) 
Figure 3: The Voronoi Tessellation with the heat map overlaid for variable 'speed' in the 'computers' dataset


  hvtHmap(
  map_A,
  scores,
  child.level = 1,
  hmap.cols = "hd",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 450
) 
Figure 4: The Voronoi Tessellation with the heat map overlaid for variable 'hd' in the 'computers' dataset


  hvtHmap(
  map_A,
  scores,
  child.level = 1,
  hmap.cols = "ram",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 450
) 
Figure 5: The Voronoi Tessellation with the heat map overlaid for variable 'ram' in the 'computers' dataset


  hvtHmap(
  map_A,
  scores,
  child.level = 1,
  hmap.cols = "screen",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 450
) 
Figure 6: The Voronoi Tessellation with the heat map overlaid for variable 'screen' in the 'computers' dataset


  hvtHmap(
  map_A,
  scores,
  child.level = 1,
  hmap.cols = "ads",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = T,
  quant.error.hmap = 0.2,
  n_cells.hmap = 450
) 
Figure 7: The Voronoi Tessellation with the heat map overlaid for variable 'ads' in the 'computers' dataset

4 Map B : Compressed Novelty Map

In this section, we will manually identify the novelty cells from the plotted map A and store them in the identified_Novelty_cells variable.

The removeNovelty function removes the identified novelty cell(s) from the dataset and stores those records separately.

It takes as input the cell numbers (Segment.Child) of the manually identified novelty cell(s) from the table above, together with the compressed HVT map (map A). It returns a list of two items: the dataset containing the novelty records, and the remainder of the dataset without those records.

identified_Novelty_cells <<- c(35,262,318)
output_list <- removeNovelty(identified_Novelty_cells, map_A)

[1] "The following cell(s) have been removed as outliers from the dataset: 35 262 318"

data_with_novelty <- output_list[[1]]
dataset_without_novelty <- output_list[[2]]

Note - In the dataset with the novelties, the total number of cells equals the number of novelty cell(s) removed from HVT map A. For example, in the above case three cells (35, 262 and 318) are identified as novelties and removed; cell 35 then becomes the first cell in HVT map B, cell 262 the second, and cell 318 the third.

Let's have a look at the data with the novelties. For the sake of brevity, we will only show the first 20 rows.

colnames(data_with_novelty) <- c("Cell.ID","Segment.Child","price","speed","hd","ram","screen","ads")
data_with_novelty %>% head(100) %>% 
  as.data.frame() %>%
  Table(scroll = T, limit = 20)
Cell.ID Segment.Child price speed hd ram screen ads
337 35 0.7042425 -0.783401 0.8177459 1.623997 -0.6149643 0.0369787
337 35 0.7949934 -0.783401 0.8177459 1.623997 -0.6149643 0.0369787
337 35 0.5445881 -0.783401 0.8177459 1.623997 -0.6149643 0.0369787
337 35 0.5445881 -0.783401 0.8177459 1.623997 -0.6149643 0.2443755
337 35 0.8807026 -0.783401 0.8177459 1.623997 -0.6149643 0.2443755
337 35 0.7949934 -0.783401 0.8177459 1.623997 -0.6149643 0.2443755
337 35 0.5445881 -0.783401 0.8177459 1.623997 -0.6149643 0.4996330
337 35 0.7126453 -0.783401 0.8177459 1.623997 -0.6149643 0.4996330
337 35 0.7949934 -0.783401 0.8177459 1.623997 -0.6149643 0.4996330
337 35 0.6269361 -0.783401 0.8177459 1.623997 -0.6149643 0.3720042
337 35 0.5361852 -0.783401 0.8177459 1.623997 -0.6149643 0.3720042
337 35 0.6269361 -0.783401 0.8177459 1.623997 -0.6149643 0.3720042
337 35 0.7949934 -0.783401 0.8177459 1.623997 -0.6149643 0.3720042
337 35 0.4588789 -0.783401 0.8177459 1.623997 -0.6149643 0.3720042
337 35 0.5429075 -0.783401 0.8177459 1.623997 -0.6149643 0.4677258
342 262 0.8790220 -0.783401 0.8748357 1.623997 -0.6149643 0.8665657
342 262 0.8807026 -0.783401 0.8748357 1.623997 -0.6149643 0.8665657
342 262 0.7949934 -0.783401 0.8748357 1.623997 -0.6149643 0.8665657
342 262 0.5445881 -0.783401 0.8748357 1.623997 -0.6149643 0.8665657
342 262 0.7042425 -0.783401 0.8748357 1.623997 -0.6149643 0.8665657
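
As a quick sanity check (a sketch), the Segment.Child column of the extracted records should contain only the three identified novelty cells:

# Count the novelty records per original map A cell
table(data_with_novelty$Segment.Child)
# expected: counts for cells 35, 262 and 318 only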

4.1 Voronoi tessellation highlighting the novelty cells in the map

The plotCells function is used to plot the Voronoi tessellation using the compressed HVT map (map A) and highlights the identified novelty cell(s) in red on the map.

plotCells(identified_Novelty_cells, map_A,line.width = c(0.4),centroid.size = 0.01)
Figure 8: The Voronoi Tessellation constructed using the compressed HVT map (map A) with the novelty cell(s) highlighted in red

We pass the data frame with the novelty records to the HVT function, along with the other model parameters mentioned below, to generate map B (layer 2).

Model Parameters

dataset_with_novelty <- data_with_novelty[, -1:-2]  # drop the Cell.ID and Segment.Child columns
map_B <- list()
map_B <- muHVT::HVT(dataset_with_novelty,
                  n_cells = 3,
                  depth = 1,
                  quant.err = 0.2,
                  projection.scale = 10,
                  normalize = F,
                  distance_metric = "L1_Norm",
                  error_metric = "max",
                  quant_method = "kmeans",
                  diagnose = F)

The datatable displayed below is the summary from map B (layer 2)

summaryTable(map_B[[3]]$summary,scroll = T,limit = 500)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error price speed hd ram screen ads
1 1 1 15 2 0.07 0.66 -0.78 0.82 1.62 -0.61 0.31
1 1 2 10 1 0.17 0.73 -0.78 -0.54 -0.09 2.88 -2.18
1 1 3 16 3 0.09 0.68 -0.78 0.84 1.62 -0.61 0.82

5 Map C : Compressed Map without Novelty

With the novelties removed, we construct another hierarchical Voronoi tessellation, map C (layer 2), on the dataset without the novelty records, using the model parameters mentioned below.

Model Parameters

map_C <- list()
map_C <- muHVT::HVT(dataset_without_novelty,
                  n_cells = 100,
                  depth = 2,
                  quant.err = 0.2,
                  projection.scale = 10,
                  normalize = F,
                  distance_metric = "L1_Norm",
                  error_metric = "max",
                  quant_method = "kmeans",
                  diagnose = F)

The datatable displayed below is the summary from map C (layer 2).

summaryTable(map_C[[3]]$summary,scroll = T,limit = 500)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error price speed hd ram screen ads
1 1 1 70 633 0.17 -0.98 -0.84 -1.17 -0.99 -0.61 0.68
1 1 2 96 310 0.35 0.06 0.92 0.32 0.05 0.55 0.33
1 1 3 17 79 0.47 1.09 2.59 0.54 0.80 2.88 0.66
1 1 4 62 595 0.29 -1.12 -0.92 -0.79 -0.84 0.55 0.28
1 1 5 62 237 0.3 0.76 -0.90 0.70 1.62 -0.61 -0.18
1 1 6 39 318 0.4 0.53 0.62 0.29 0.06 -0.61 -0.83
1 1 7 52 485 0.29 -0.03 0.09 -0.61 -0.64 -0.61 0.49
1 1 8 45 136 0.37 0.38 -0.54 1.67 1.62 0.55 -0.89
1 1 9 69 361 0.3 -0.42 0.92 0.41 -0.03 -0.61 -0.20
1 1 10 42 373 0.27 -0.11 -0.83 0.52 0.03 0.55 0.38
1 1 11 85 586 0.27 -0.73 -0.83 -0.69 -0.71 -0.61 0.85
1 1 12 67 313 0.3 0.68 0.92 0.24 0.03 -0.61 0.50
1 1 13 40 434 0.36 0.71 -0.65 -0.27 -0.48 -0.61 0.30
1 1 14 30 179 0.35 1.18 0.48 0.69 1.62 -0.61 -0.42
1 1 15 70 371 0.39 0.50 0.47 -0.57 0.00 -0.61 -2.18
1 1 16 59 246 0.69 0.44 2.67 0.60 -0.02 -0.10 -0.02
1 1 17 36 550 0.21 -0.47 -0.87 -0.63 -0.70 -0.61 -0.47
1 1 18 66 341 0.33 0.40 0.05 -0.07 -0.04 0.55 0.53
1 1 19 59 70 1.07 2.61 0.86 0.46 1.25 2.88 0.19
1 1 20 28 516 0.35 -0.76 -0.56 -0.17 -0.73 2.88 -0.14
1 1 21 44 579 0.2 -0.71 0.09 -1.11 -1.00 -0.61 0.19
1 1 22 30 180 0.24 0.80 0.81 0.82 1.62 -0.61 1.21
1 1 23 42 585 0.32 -0.49 -0.83 -0.72 -0.62 -0.61 -1.28
1 1 24 42 419 0.39 0.40 0.80 -0.56 -0.69 -0.61 -0.33
1 1 25 72 605 0.34 -0.27 0.44 -0.99 -0.79 -0.61 -2.14
1 1 26 63 608 0.19 -0.68 -0.84 -1.15 -0.91 -0.61 0.00
1 1 27 33 369 0.33 0.36 -0.32 -0.34 -0.13 0.55 -0.83
1 1 28 56 390 0.43 0.30 -0.45 -0.53 -0.13 0.55 -2.09
1 1 29 40 627 0.35 -1.15 -0.92 -0.83 -0.82 0.55 1.09
1 1 30 43 266 0.46 1.05 0.92 -0.37 0.04 0.55 -1.96
1 1 31 34 142 0.51 2.61 0.52 0.47 0.11 -0.48 -2.11
1 1 32 46 639 0.2 -1.25 -0.86 -0.77 -0.75 -0.61 1.38
1 1 33 35 619 0.36 -0.27 -0.95 -0.76 -0.72 2.88 0.84
1 1 34 60 522 0.23 -0.13 -0.82 -0.57 -0.72 -0.61 0.40
1 1 35 77 163 0.42 1.31 0.63 0.85 1.62 -0.61 0.47
1 1 36 65 473 0.26 -0.04 0.94 -0.76 -0.71 -0.61 0.60
1 1 37 66 144 0.81 1.53 0.88 0.39 1.72 0.55 0.15
1 1 38 44 617 0.19 -1.39 -0.84 -0.66 -0.73 -0.61 -0.59
1 1 39 16 366 0.36 -0.70 -0.29 0.78 -0.03 0.55 -0.90
1 1 40 37 588 0.2 -0.36 -1.01 -1.08 -0.73 -0.61 0.58
1 1 41 58 351 0.25 0.37 0.09 0.27 0.06 -0.61 0.53
1 1 42 44 196 0.43 0.96 -0.50 0.03 1.62 -0.30 -2.01
1 1 43 71 17 1.75 1.58 -0.07 3.10 3.25 0.91 -0.84
1 1 44 68 533 0.29 0.03 -0.84 -0.56 -0.10 -0.61 -2.19
1 1 45 66 166 0.4 0.89 0.86 0.59 1.62 0.55 0.80
1 1 46 24 312 0.34 0.95 -0.74 -0.50 -0.72 2.88 0.35
1 1 47 13 277 0.32 1.98 -0.31 0.41 0.06 -0.26 0.43
1 1 48 44 320 0.38 -0.07 0.77 0.36 0.05 0.55 1.29
1 1 49 31 139 0.72 1.19 0.27 -0.42 -0.06 2.88 -1.83
1 1 50 48 255 0.28 0.43 2.67 0.38 0.06 -0.11 1.38
1 1 51 25 548 0.27 -1.38 -0.71 -0.02 -0.73 0.55 -0.56
1 1 52 42 498 0.31 -0.69 0.09 -0.57 -0.80 0.55 0.51
1 1 53 48 509 0.27 -0.28 -0.82 -0.58 -0.72 0.55 0.52
1 1 54 36 670 0.23 -1.60 -1.07 -1.18 -1.03 -0.61 0.96
1 1 55 84 607 0.27 -1.38 -0.82 -0.68 -0.73 -0.61 0.37
1 1 56 52 468 0.35 -0.36 -0.93 0.33 0.03 -0.61 1.19
1 1 57 45 598 0.23 -0.92 0.09 -0.94 -0.88 -0.61 0.98
1 1 58 43 43 1.16 2.03 0.74 3.50 2.13 -0.24 0.23
1 1 59 61 521 0.25 -0.67 0.92 -0.95 -0.88 -0.61 0.16
1 1 60 65 646 0.26 -1.38 -1.07 -1.14 -0.90 -0.61 0.17
1 1 61 38 624 0.38 -0.19 2.67 -0.78 -0.83 -0.37 0.99
1 1 62 40 217 0.36 0.64 -0.90 0.70 1.62 0.55 0.41
1 1 63 46 569 0.33 -1.32 -0.81 0.07 -0.58 -0.61 -0.64
1 1 64 35 442 0.36 -0.48 -0.83 0.08 0.02 0.55 1.09
1 1 65 58 518 0.32 -0.69 0.56 -0.69 -0.72 -0.61 -0.93
1 1 66 32 130 0.43 0.68 0.88 1.32 1.62 0.55 -0.86
1 1 67 73 565 0.12 -0.74 -0.78 -0.68 -0.72 -0.61 0.24
1 1 68 29 81 0.74 0.88 0.55 1.60 1.35 2.88 -0.48
1 1 69 50 600 0.13 -0.81 -1.20 -0.76 -0.72 -0.61 0.32
1 1 70 59 452 0.41 -0.37 0.95 -0.64 -0.74 0.55 0.65
1 1 71 85 241 0.36 0.55 -0.95 0.93 1.62 -0.61 0.68
1 1 72 157 416 0.28 -0.08 -0.87 0.40 0.05 -0.61 0.42
1 1 73 66 411 0.3 -0.25 -0.83 0.54 0.05 -0.61 -0.72
1 1 74 25 250 0.38 1.70 0.92 0.62 0.03 -0.57 0.76
1 1 75 40 484 0.32 -0.60 0.09 0.11 -0.56 -0.61 0.77
1 1 76 46 103 0.51 1.16 2.67 0.94 1.62 -0.13 0.57
1 1 77 44 549 0.4 -1.17 -0.84 0.04 -0.58 -0.61 0.36
1 1 78 23 592 0.26 -1.20 -0.84 -0.09 -0.51 -0.61 1.21
1 1 79 52 288 0.38 0.20 -0.64 0.10 -0.01 2.88 0.51
1 1 80 41 335 0.31 0.18 0.87 0.32 0.04 -0.61 1.37
1 1 81 37 185 0.72 2.32 1.11 0.84 -0.17 0.52 0.09
1 1 82 46 653 0.24 -1.27 -0.93 -0.95 -0.82 -0.61 -1.33
1 1 83 38 436 0.28 -0.59 -0.76 -0.07 0.04 0.55 0.11
1 1 84 67 173 0.58 1.25 0.68 0.07 -0.08 2.88 0.77
1 1 85 89 669 0.31 -0.86 -0.95 -1.01 -0.77 -0.54 -2.30
1 1 86 37 50 0.44 1.16 -0.78 3.06 3.10 -0.46 0.07
1 1 87 35 152 0.96 0.35 0.00 3.53 0.02 -0.28 -0.26
1 1 88 22 9 0.58 1.99 2.67 3.06 2.90 -0.30 0.21
1 1 89 50 408 0.3 -0.33 0.93 0.12 -0.47 -0.61 0.45
1 1 90 66 281 0.38 1.08 0.92 0.01 -0.09 0.55 0.42
1 1 91 19 655 0.2 -1.46 -1.04 -1.15 -0.88 -0.61 -0.45
1 1 92 45 239 0.51 0.18 0.82 0.12 -0.15 2.88 0.00
1 1 93 34 606 0.38 -0.93 -0.82 -0.86 -0.82 0.55 -1.02
1 1 94 25 527 0.27 -1.30 0.19 -0.25 -0.59 -0.61 -0.22
1 1 95 64 638 0.24 -1.38 -1.15 -0.91 -0.71 -0.61 0.72
1 1 96 57 577 0.33 -0.68 1.04 -0.93 -0.91 -0.61 1.15
1 1 97 43 454 0.43 -0.72 0.69 -0.44 -0.61 0.55 -0.65
1 1 98 41 495 0.25 -0.50 -0.82 -0.47 0.06 -0.61 0.42
1 1 99 53 367 0.38 -0.60 0.70 0.64 -0.03 -0.61 -0.98
1 1 100 53 291 0.31 -0.30 0.92 0.96 0.06 0.55 -0.63
2 1 1 0 NA NA NA NA NA NA NA NA
2 1 2 0 NA NA NA NA NA NA NA NA
2 1 3 0 NA NA NA NA NA NA NA NA
2 1 4 0 NA NA NA NA NA NA NA NA
2 1 5 0 NA NA NA NA NA NA NA NA
2 1 6 0 NA NA NA NA NA NA NA NA
2 1 7 0 NA NA NA NA NA NA NA NA
2 1 8 0 NA NA NA NA NA NA NA NA
2 1 9 0 NA NA NA NA NA NA NA NA
2 1 10 0 NA NA NA NA NA NA NA NA
2 1 11 0 NA NA NA NA NA NA NA NA
2 1 12 0 NA NA NA NA NA NA NA NA
2 1 13 0 NA NA NA NA NA NA NA NA
2 1 14 0 NA NA NA NA NA NA NA NA
2 1 15 0 NA NA NA NA NA NA NA NA
2 1 16 0 NA NA NA NA NA NA NA NA
2 1 17 0 NA NA NA NA NA NA NA NA
2 1 18 0 NA NA NA NA NA NA NA NA
2 1 19 0 NA NA NA NA NA NA NA NA
2 1 20 0 NA NA NA NA NA NA NA NA
2 1 21 0 NA NA NA NA NA NA NA NA
2 1 22 0 NA NA NA NA NA NA NA NA
2 1 23 0 NA NA NA NA NA NA NA NA
2 1 24 0 NA NA NA NA NA NA NA NA
2 1 25 0 NA NA NA NA NA NA NA NA
2 1 26 0 NA NA NA NA NA NA NA NA
2 1 27 0 NA NA NA NA NA NA NA NA
2 1 28 0 NA NA NA NA NA NA NA NA
2 1 29 0 NA NA NA NA NA NA NA NA
2 1 30 0 NA NA NA NA NA NA NA NA
2 1 31 0 NA NA NA NA NA NA NA NA
2 1 32 0 NA NA NA NA NA NA NA NA
2 1 33 0 NA NA NA NA NA NA NA NA
2 1 34 0 NA NA NA NA NA NA NA NA
2 1 35 0 NA NA NA NA NA NA NA NA
2 1 36 0 NA NA NA NA NA NA NA NA
2 1 37 0 NA NA NA NA NA NA NA NA
2 1 38 0 NA NA NA NA NA NA NA NA
2 1 39 0 NA NA NA NA NA NA NA NA
2 1 40 0 NA NA NA NA NA NA NA NA
2 1 41 0 NA NA NA NA NA NA NA NA
2 1 42 0 NA NA NA NA NA NA NA NA
2 1 43 0 NA NA NA NA NA NA NA NA
2 1 44 0 NA NA NA NA NA NA NA NA
2 1 45 0 NA NA NA NA NA NA NA NA
2 1 46 0 NA NA NA NA NA NA NA NA
2 1 47 0 NA NA NA NA NA NA NA NA
2 1 48 0 NA NA NA NA NA NA NA NA
2 1 49 0 NA NA NA NA NA NA NA NA
2 1 50 0 NA NA NA NA NA NA NA NA
2 1 51 0 NA NA NA NA NA NA NA NA
2 1 52 0 NA NA NA NA NA NA NA NA
2 1 53 0 NA NA NA NA NA NA NA NA
2 1 54 0 NA NA NA NA NA NA NA NA
2 1 55 0 NA NA NA NA NA NA NA NA
2 1 56 0 NA NA NA NA NA NA NA NA
2 1 57 0 NA NA NA NA NA NA NA NA
2 1 58 0 NA NA NA NA NA NA NA NA
2 1 59 0 NA NA NA NA NA NA NA NA
2 1 60 0 NA NA NA NA NA NA NA NA
2 1 61 0 NA NA NA NA NA NA NA NA
2 1 62 0 NA NA NA NA NA NA NA NA
2 1 63 0 NA NA NA NA NA NA NA NA
2 1 64 0 NA NA NA NA NA NA NA NA
2 1 65 0 NA NA NA NA NA NA NA NA
2 1 66 0 NA NA NA NA NA NA NA NA
2 1 67 0 NA NA NA NA NA NA NA NA
2 1 68 0 NA NA NA NA NA NA NA NA
2 1 69 0 NA NA NA NA NA NA NA NA
2 1 70 0 NA NA NA NA NA NA NA NA
2 1 71 0 NA NA NA NA NA NA NA NA
2 1 72 0 NA NA NA NA NA NA NA NA
2 1 73 0 NA NA NA NA NA NA NA NA
2 1 74 0 NA NA NA NA NA NA NA NA
2 1 75 0 NA NA NA NA NA NA NA NA
2 1 76 0 NA NA NA NA NA NA NA NA
2 1 77 0 NA NA NA NA NA NA NA NA
2 1 78 0 NA NA NA NA NA NA NA NA
2 1 79 0 NA NA NA NA NA NA NA NA
2 1 80 0 NA NA NA NA NA NA NA NA
2 1 81 0 NA NA NA NA NA NA NA NA
2 1 82 0 NA NA NA NA NA NA NA NA
2 1 83 0 NA NA NA NA NA NA NA NA
2 1 84 0 NA NA NA NA NA NA NA NA
2 1 85 0 NA NA NA NA NA NA NA NA
2 1 86 0 NA NA NA NA NA NA NA NA
2 1 87 0 NA NA NA NA NA NA NA NA
2 1 88 0 NA NA NA NA NA NA NA NA
2 1 89 0 NA NA NA NA NA NA NA NA
2 1 90 0 NA NA NA NA NA NA NA NA
2 1 91 0 NA NA NA NA NA NA NA NA
2 1 92 0 NA NA NA NA NA NA NA NA
2 1 93 0 NA NA NA NA NA NA NA NA
2 1 94 0 NA NA NA NA NA NA NA NA
2 1 95 0 NA NA NA NA NA NA NA NA
2 1 96 0 NA NA NA NA NA NA NA NA
2 1 97 0 NA NA NA NA NA NA NA NA
2 1 98 0 NA NA NA NA NA NA NA NA
2 1 99 0 NA NA NA NA NA NA NA NA
2 1 100 0 NA NA NA NA NA NA NA NA
2 2 1 14 290 0.1 -0.06 0.92 0.87 0.06 0.55 0.38
2 2 2 10 274 0.16 0.52 0.92 0.87 -0.02 0.55 0.46
2 2 3 10 300 0.11 0.21 0.92 0.28 0.06 0.55 -0.23
2 2 4 19 326 0.17 0.10 0.92 -0.20 0.02 0.55 0.36
2 2 5 22 340 0.12 -0.47 0.92 0.18 0.06 0.55 0.30
2 2 6 21 293 0.1 0.36 0.92 0.33 0.06 0.55 0.49
2 2 7 0 NA NA NA NA NA NA NA NA
2 2 8 0 NA NA NA NA NA NA NA NA
2 2 9 0 NA NA NA NA NA NA NA NA
2 2 10 0 NA NA NA NA NA NA NA NA
2 2 11 0 NA NA NA NA NA NA NA NA
2 2 12 0 NA NA NA NA NA NA NA NA
2 2 13 0 NA NA NA NA NA NA NA NA
2 2 14 0 NA NA NA NA NA NA NA NA
2 2 15 0 NA NA NA NA NA NA NA NA
2 2 16 0 NA NA NA NA NA NA NA NA
2 2 17 0 NA NA NA NA NA NA NA NA
2 2 18 0 NA NA NA NA NA NA NA NA
2 2 19 0 NA NA NA NA NA NA NA NA
2 2 20 0 NA NA NA NA NA NA NA NA
2 2 21 0 NA NA NA NA NA NA NA NA
2 2 22 0 NA NA NA NA NA NA NA NA
2 2 23 0 NA NA NA NA NA NA NA NA
2 2 24 0 NA NA NA NA NA NA NA NA
2 2 25 0 NA NA NA NA NA NA NA NA
2 2 26 0 NA NA NA NA NA NA NA NA
2 2 27 0 NA NA NA NA NA NA NA NA
2 2 28 0 NA NA NA NA NA NA NA NA
2 2 29 0 NA NA NA NA NA NA NA NA
2 2 30 0 NA NA NA NA NA NA NA NA
2 2 31 0 NA NA NA NA NA NA NA NA
2 2 32 0 NA NA NA NA NA NA NA NA
2 2 33 0 NA NA NA NA NA NA NA NA
2 2 34 0 NA NA NA NA NA NA NA NA
2 2 35 0 NA NA NA NA NA NA NA NA
2 2 36 0 NA NA NA NA NA NA NA NA
2 2 37 0 NA NA NA NA NA NA NA NA
2 2 38 0 NA NA NA NA NA NA NA NA
2 2 39 0 NA NA NA NA NA NA NA NA
2 2 40 0 NA NA NA NA NA NA NA NA
2 2 41 0 NA NA NA NA NA NA NA NA
2 2 42 0 NA NA NA NA NA NA NA NA
2 2 43 0 NA NA NA NA NA NA NA NA
2 2 44 0 NA NA NA NA NA NA NA NA
2 2 45 0 NA NA NA NA NA NA NA NA
2 2 46 0 NA NA NA NA NA NA NA NA
2 2 47 0 NA NA NA NA NA NA NA NA
2 2 48 0 NA NA NA NA NA NA NA NA
2 2 49 0 NA NA NA NA NA NA NA NA
2 2 50 0 NA NA NA NA NA NA NA NA
2 2 51 0 NA NA NA NA NA NA NA NA
2 2 52 0 NA NA NA NA NA NA NA NA
2 2 53 0 NA NA NA NA NA NA NA NA
2 2 54 0 NA NA NA NA NA NA NA NA
2 2 55 0 NA NA NA NA NA NA NA NA
2 2 56 0 NA NA NA NA NA NA NA NA
2 2 57 0 NA NA NA NA NA NA NA NA
2 2 58 0 NA NA NA NA NA NA NA NA
2 2 59 0 NA NA NA NA NA NA NA NA
2 2 60 0 NA NA NA NA NA NA NA NA
2 2 61 0 NA NA NA NA NA NA NA NA
2 2 62 0 NA NA NA NA NA NA NA NA
2 2 63 0 NA NA NA NA NA NA NA NA
2 2 64 0 NA NA NA NA NA NA NA NA
2 2 65 0 NA NA NA NA NA NA NA NA
2 2 66 0 NA NA NA NA NA NA NA NA
2 2 67 0 NA NA NA NA NA NA NA NA
2 2 68 0 NA NA NA NA NA NA NA NA
2 2 69 0 NA NA NA NA NA NA NA NA
2 2 70 0 NA NA NA NA NA NA NA NA
2 2 71 0 NA NA NA NA NA NA NA NA
2 2 72 0 NA NA NA NA NA NA NA NA
2 2 73 0 NA NA NA NA NA NA NA NA
2 2 74 0 NA NA NA NA NA NA NA NA
2 2 75 0 NA NA NA NA NA NA NA NA
2 2 76 0 NA NA NA NA NA NA NA NA
2 2 77 0 NA NA NA NA NA NA NA NA
2 2 78 0 NA NA NA NA NA NA NA NA
2 2 79 0 NA NA NA NA NA NA NA NA
2 2 80 0 NA NA NA NA NA NA NA NA
2 2 81 0 NA NA NA NA NA NA NA NA
2 2 82 0 NA NA NA NA NA NA NA NA
2 2 83 0 NA NA NA NA NA NA NA NA
2 2 84 0 NA NA NA NA NA NA NA NA
2 2 85 0 NA NA NA NA NA NA NA NA
2 2 86 0 NA NA NA NA NA NA NA NA
2 2 87 0 NA NA NA NA NA NA NA NA
2 2 88 0 NA NA NA NA NA NA NA NA
2 2 89 0 NA NA NA NA NA NA NA NA
2 2 90 0 NA NA NA NA NA NA NA NA
2 2 91 0 NA NA NA NA NA NA NA NA
2 2 92 0 NA NA NA NA NA NA NA NA
2 2 93 0 NA NA NA NA NA NA NA NA
2 2 94 0 NA NA NA NA NA NA NA NA
2 2 95 0 NA NA NA NA NA NA NA NA
2 2 96 0 NA NA NA NA NA NA NA NA
2 2 97 0 NA NA NA NA NA NA NA NA
2 2 98 0 NA NA NA NA NA NA NA NA
2 2 99 0 NA NA NA NA NA NA NA NA
2 2 100 0 NA NA NA NA NA NA NA NA
2 3 1 3 27 0.07 1.76 2.67 0.81 1.62 2.88 1.18
2 3 2 5 105 0.13 0.93 2.67 0.42 0.06 2.88 0.16
2 3 3 2 98 0.06 0.71 2.67 0.13 0.06 2.88 1.52
2 3 4 2 73 0.02 1.03 2.67 0.85 0.06 2.88 1.52
2 3 5 1 100 0 0.75 1.38 0.34 1.62 2.88 0.47
2 3 6 4 58 0.13 1.07 2.67 0.58 1.62 2.88 0.08
2 3 7 0 NA NA NA NA NA NA NA NA
2 3 8 0 NA NA NA NA NA NA NA NA
2 3 9 0 NA NA NA NA NA NA NA NA
2 3 10 0 NA NA NA NA NA NA NA NA
2 3 11 0 NA NA NA NA NA NA NA NA
2 3 12 0 NA NA NA NA NA NA NA NA
2 3 13 0 NA NA NA NA NA NA NA NA
2 3 14 0 NA NA NA NA NA NA NA NA
2 3 15 0 NA NA NA NA NA NA NA NA
2 3 16 0 NA NA NA NA NA NA NA NA
2 3 17 0 NA NA NA NA NA NA NA NA
2 3 18 0 NA NA NA NA NA NA NA NA
2 3 19 0 NA NA NA NA NA NA NA NA
2 3 20 0 NA NA NA NA NA NA NA NA
2 3 21 0 NA NA NA NA NA NA NA NA
2 3 22 0 NA NA NA NA NA NA NA NA
2 3 23 0 NA NA NA NA NA NA NA NA
2 3 24 0 NA NA NA NA NA NA NA NA
2 3 25 0 NA NA NA NA NA NA NA NA
2 3 26 0 NA NA NA NA NA NA NA NA
2 3 27 0 NA NA NA NA NA NA NA NA
2 3 28 0 NA NA NA NA NA NA NA NA
2 3 29 0 NA NA NA NA NA NA NA NA
2 3 30 0 NA NA NA NA NA NA NA NA
2 3 31 0 NA NA NA NA NA NA NA NA
2 3 32 0 NA NA NA NA NA NA NA NA
2 3 33 0 NA NA NA NA NA NA NA NA
2 3 34 0 NA NA NA NA NA NA NA NA
2 3 35 0 NA NA NA NA NA NA NA NA
2 3 36 0 NA NA NA NA NA NA NA NA
2 3 37 0 NA NA NA NA NA NA NA NA
2 3 38 0 NA NA NA NA NA NA NA NA
2 3 39 0 NA NA NA NA NA NA NA NA
2 3 40 0 NA NA NA NA NA NA NA NA
2 3 41 0 NA NA NA NA NA NA NA NA
2 3 42 0 NA NA NA NA NA NA NA NA
2 3 43 0 NA NA NA NA NA NA NA NA
2 3 44 0 NA NA NA NA NA NA NA NA
2 3 45 0 NA NA NA NA NA NA NA NA
2 3 46 0 NA NA NA NA NA NA NA NA
2 3 47 0 NA NA NA NA NA NA NA NA
2 3 48 0 NA NA NA NA NA NA NA NA
2 3 49 0 NA NA NA NA NA NA NA NA
2 3 50 0 NA NA NA NA NA NA NA NA
2 3 51 0 NA NA NA NA NA NA NA NA
2 3 52 0 NA NA NA NA NA NA NA NA
2 3 53 0 NA NA NA NA NA NA NA NA
2 3 54 0 NA NA NA NA NA NA NA NA
2 3 55 0 NA NA NA NA NA NA NA NA
2 3 56 0 NA NA NA NA NA NA NA NA
2 3 57 0 NA NA NA NA NA NA NA NA
2 3 58 0 NA NA NA NA NA NA NA NA
2 3 59 0 NA NA NA NA NA NA NA NA
2 3 60 0 NA NA NA NA NA NA NA NA
2 3 61 0 NA NA NA NA NA NA NA NA
2 3 62 0 NA NA NA NA NA NA NA NA
2 3 63 0 NA NA NA NA NA NA NA NA
2 3 64 0 NA NA NA NA NA NA NA NA
2 3 65 0 NA NA NA NA NA NA NA NA
2 3 66 0 NA NA NA NA NA NA NA NA
2 3 67 0 NA NA NA NA NA NA NA NA
2 3 68 0 NA NA NA NA NA NA NA NA
2 3 69 0 NA NA NA NA NA NA NA NA
2 3 70 0 NA NA NA NA NA NA NA NA
2 3 71 0 NA NA NA NA NA NA NA NA
2 3 72 0 NA NA NA NA NA NA NA NA
2 3 73 0 NA NA NA NA NA NA NA NA
2 3 74 0 NA NA NA NA NA NA NA NA
2 3 75 0 NA NA NA NA NA NA NA NA
2 3 76 0 NA NA NA NA NA NA NA NA
2 3 77 0 NA NA NA NA NA NA NA NA
2 3 78 0 NA NA NA NA NA NA NA NA
2 3 79 0 NA NA NA NA NA NA NA NA
2 3 80 0 NA NA NA NA NA NA NA NA
2 3 81 0 NA NA NA NA NA NA NA NA
2 3 82 0 NA NA NA NA NA NA NA NA
2 3 83 0 NA NA NA NA NA NA NA NA
2 3 84 0 NA NA NA NA NA NA NA NA
2 3 85 0 NA NA NA NA NA NA NA NA
2 3 86 0 NA NA NA NA NA NA NA NA
2 3 87 0 NA NA NA NA NA NA NA NA
2 3 88 0 NA NA NA NA NA NA NA NA
2 3 89 0 NA NA NA NA NA NA NA NA
2 3 90 0 NA NA NA NA NA NA NA NA
2 3 91 0 NA NA NA NA NA NA NA NA
2 3 92 0 NA NA NA NA NA NA NA NA
2 3 93 0 NA NA NA NA NA NA NA NA
2 3 94 0 NA NA NA NA NA NA NA NA
2 3 95 0 NA NA NA NA NA NA NA NA
2 3 96 0 NA NA NA NA NA NA NA NA
2 3 97 0 NA NA NA NA NA NA NA NA
2 3 98 0 NA NA NA NA NA NA NA NA
2 3 99 0 NA NA NA NA NA NA NA NA
2 3 100 0 NA NA NA NA NA NA NA NA
2 4 1 10 562 0.09 -1.07 -0.78 -0.67 -0.72 0.55 0.05
2 4 2 6 555 0.1 -0.69 -1.13 -0.68 -0.72 0.55 0.08
2 4 3 11 648 0.15 -1.37 -1.16 -1.10 -0.97 0.55 0.31
2 4 4 16 573 0.15 -1.06 -0.91 -0.67 -0.72 0.55 0.49
2 4 5 5 537 0.08 -1.24 -0.78 -0.08 -0.72 0.55 0.38
2 4 6 11 622 0.1 -0.99 -0.78 -1.19 -1.11 0.55 0.31
2 4 7 3 626 0.1 -1.76 -0.78 -0.57 -0.98 0.55 0.07
... (level 2, parent 4, children 8-100 omitted: every row has n = 0 and all remaining values NA) ...

Now let’s check the compression summary for HVT (map C). The table below shows, for each level, the number of cells, the number of cells with quantization error below the threshold, and the percentage of cells with quantization error below the threshold.

mapC_compression_summary <- map_C[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapC_compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 100 6 0.06 n_cells: 100 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans
2 577 577 1 n_cells: 100 quant.err: 0.2 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, 6% of the cells meet the quantization error threshold at level 1, while 100% of the cells meet it at level 2.
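For reference, the percentage column is simply the ratio of the two count columns; a minimal sanity check using the values shown above:

# percentOfCellsBelowQuantizationErrorThreshold is the ratio of the two
# count columns in the compression summary
noOfCells <- c(100, 577)
noOfCellsBelowQuantizationError <- c(6, 577)
round(noOfCellsBelowQuantizationError / noOfCells, 2)
#> [1] 0.06 1.00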

Let’s plot the Voronoi tessellation for layer 2 (map C).

muHVT::plotHVT(map_C,
        line.width = c(0.4,0.2), 
        color.vec = c("#141B41","#0582CA"),
        centroid.size = 0.1,
        maxDepth = 2) 
Figure 9: The Voronoi Tessellation for layer 2 (map C) shown for the 100 cells in the dataset 'computers' at level 2

Now let’s plot all the features for each cell at level two as a heatmap for better visualization.

The heatmaps displayed below provide a visual representation of the spatial characteristics of the computers data, allowing us to observe patterns and trends in the distribution of each feature (price, speed, hd, ram, screen, ads). Green shades highlight regions with higher values, while indigo shades indicate areas with the lowest values. By analyzing these heatmaps, we can gain insight into the variations and relationships among these features within the computers data. Since the same call is repeated once per feature, the six plots could equivalently be generated in a loop, as in the sketch below.
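A short sketch of that loop, assuming the map_C and scores objects from the earlier steps and that hvtHmap returns a printable plot object:

# Loop over the six features; print() is needed because plots do not
# auto-print inside a for loop
for (feature in c("price", "speed", "hd", "ram", "screen", "ads")) {
  print(hvtHmap(
    map_C,
    scores,
    child.level = 2,
    hmap.cols = feature,
    line.width = c(0.6, 0.4),
    color.vec = c("#141B41", "#0582CA"),
    palette.color = 6,
    centroid.size = 0.1,
    show.points = TRUE,
    quant.error.hmap = 0.2,
    n_cells.hmap = 100
  ))
}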


hvtHmap(
  map_C,
  scores,
  child.level = 2,
  hmap.cols = "price",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 10: The Voronoi Tessellation with the heat map overlaid for the feature price in the 'computers' dataset


hvtHmap(
  map_C,
  scores,
  child.level = 2,
  hmap.cols = "speed",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 11: The Voronoi Tessellation with the heat map overlaid for the feature speed in the 'computers' dataset


hvtHmap(
  map_C,
  scores,
  child.level = 2,
  hmap.cols = "hd",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 12: The Voronoi Tessellation with the heat map overlaid for the feature hd in the 'computers' dataset


hvtHmap(
  map_C,
  scores,
  child.level = 2,
  hmap.cols = "ram",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 13: The Voronoi Tessellation with the heat map overlaid for the feature ram in the 'computers' dataset


hvtHmap(
  map_C,
  scores,
  child.level = 2,
  hmap.cols = "screen",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 14: The Voronoi Tessellation with the heat map overlaid for the feature screen in the 'computers' dataset


hvtHmap(
  map_C,
  scores,
  child.level = 2,
  hmap.cols = "ads",
  line.width = c(0.6, 0.4),
  color.vec = c("#141B41", "#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100
)
Figure 15: The Voronoi Tessellation with the heat map overlaid for the feature ads in the 'computers' dataset

We now have the set of maps (map A, map B and map C) that will be used to predict which map and cell each test record is assigned to. Before that, let’s view our test dataset.

6 Prediction on Test Data

Now that we have built the model, let us predict which cell and which layer each point in the test dataset belongs to.

The predictLayerHVT function is used to score the test data against the predictive set of maps. It takes as input the test data and the set of maps (map A, map B, map C).

Let’s have a look at our scaled test dataset.

# Quick peek
testComputers <- scale(testComputers, center = scale_attr$`scaled:center`, scale = scale_attr$`scaled:scale`) %>% as.data.frame()
testComputers1 <- round(testComputers,4)
Table(head(testComputers1))
price speed hd ram screen ads
-1.2284 -0.7834 -0.6761 -0.7176 0.549 -0.8405
1.3832 0.0920 3.0633 3.1850 0.549 -0.8405
-0.8016 0.0920 -0.6761 -0.7176 -0.615 -0.8405
0.2303 2.6667 -0.4097 -0.7176 -0.615 -0.8405
0.3076 0.9159 1.7312 1.6240 0.549 -0.8405
-0.5075 0.9159 3.0633 0.0630 -0.615 -0.8405

Now let us check the structure of the test data and analyse its summary.

str(testComputers)
#> 'data.frame':    1253 obs. of  6 variables:
#>  $ price : num  -1.228 1.383 -0.802 0.23 0.308 ...
#>  $ speed : num  -0.783 0.092 0.092 2.667 0.916 ...
#>  $ hd    : num  -0.676 3.063 -0.676 -0.41 1.731 ...
#>  $ ram   : num  -0.718 3.185 -0.718 -0.718 1.624 ...
#>  $ screen: num  0.549 0.549 -0.615 -0.615 0.549 ...
#>  $ ads   : num  -0.84 -0.84 -0.84 -0.84 -0.84 ...
summary(testComputers)
#>      price             speed               hd               ram          
#>  Min.   :-2.2216   Min.   :-0.7834   Min.   :-1.0995   Min.   :-1.10781  
#>  1st Qu.:-1.0368   1st Qu.: 0.0920   1st Qu.: 0.3420   1st Qu.: 0.06296  
#>  Median :-0.6167   Median : 0.9159   Median : 0.8986   Median : 0.06296  
#>  Mean   :-0.4276   Mean   : 0.9755   Mean   : 1.4372   Mean   : 0.60148  
#>  3rd Qu.: 0.1228   3rd Qu.: 1.3793   3rd Qu.: 2.3497   3rd Qu.: 1.62400  
#>  Max.   : 2.8957   Max.   : 2.6667   Max.   : 8.2965   Max.   : 4.74606  
#>      screen             ads         
#>  Min.   :-0.6150   Min.   :-3.2654  
#>  1st Qu.:-0.6150   1st Qu.:-1.8296  
#>  Median : 0.5490   Median :-1.4627  
#>  Mean   : 0.4672   Mean   :-1.7811  
#>  3rd Qu.: 0.5490   3rd Qu.:-1.2872  
#>  Max.   : 2.8768   Max.   : 0.7708
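Note that, unlike the training summary, the test means are not zero: the test set was scaled with the training set’s center and scale (saved earlier in scale_attr), so any distribution shift between the two sets shows up directly in the test means. A quick way to confirm this:

# The test means are non-zero because scaling reused the *training*
# center and scale; re-scaling the test set by its own statistics
# forces the means back to (approximately) zero
round(colMeans(testComputers), 4)
round(colMeans(scale(testComputers)), 4)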

Now, let us understand the predictLayerHVT function:

predictLayerHVT(data,
                map_A,
                map_B,
                map_C,
                mad.threshold = 0.2,
                normalize = TRUE,
                distance_metric = "L1_Norm",
                error_metric = "max",
                child.level = 1,
                line.width = c(0.6, 0.4, 0.2),
                color.vec = c("#141B41", "#6369D1", "#D8D2E1"),
                yVar = NULL,
                ...)

The key inputs to predictLayerHVT are the test data, the three maps (map A, map B and map C) and the novelty threshold mad.threshold; the remaining arguments mirror the settings used to build the maps (normalize, distance_metric, error_metric) and control the plot aesthetics (child.level, line.width, color.vec), with yVar optionally naming a dependent variable.

The function predicts based on the HVT maps constructed above - map A, map B and map C. For each test record, it assigns a cell ID in Layer 1 and a cell ID in Layer 2: Layer 1 contains the cell IDs from map A, while Layer 2 contains cell IDs from map B (the novelty map) or map C (the map without novelties).
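Conceptually, the routing can be sketched as below. This is a simplified illustration, not the package’s actual implementation: cents_A, cents_B and cents_C stand in for the maps’ centroid matrices, the L1 norm mirrors distance_metric = "L1_Norm", and the mean absolute deviation plays the role of the quantization error compared against mad.threshold.

# Simplified routing sketch (NOT the predictLayerHVT internals)
nearest_cell <- function(record, centroids) {
  d <- apply(centroids, 1, function(cc) sum(abs(record - cc)))  # L1 norm
  which.min(d)
}

assign_layers <- function(record, cents_A, cents_B, cents_C,
                          mad.threshold = 0.2) {
  i <- nearest_cell(record, cents_A)
  # mean absolute deviation from the winning map A centroid
  q_err <- mean(abs(record - cents_A[i, ]))
  layer2 <- if (q_err > mad.threshold) {
    paste0("B", nearest_cell(record, cents_B))  # novelty map
  } else {
    paste0("C", nearest_cell(record, cents_C))  # map without novelties
  }
  c(Layer1.Cell.ID = paste0("A", i), Layer2.Cell.ID = layer2)
}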

Note: The prediction algorithm will not work if any of the variables used to perform the quantization are missing, so no features should be removed from the test dataset.
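A quick pre-flight check for this, using the metric_list of feature names saved when the training data was scaled, might look like:

# Fail early if the test data is missing any feature used in training
stopifnot(all(metric_list %in% colnames(testComputers)))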

Let’s see which cell and layer each point belongs to. For the sake of brevity, we will only show the first 20 rows.

validation_data <- testComputers
# normalize = FALSE because the test data was already scaled above
# using the training set's center and scale
new_predict <- predictLayerHVT(
  data = validation_data,
  map_A,
  map_B,
  map_C,
  normalize = FALSE
)
new_predict %>% head(100) %>%
  as.data.frame() %>%
  Table(scroll = TRUE, limit = 20)
Row.Number Layer1.Cell.ID Layer2.Cell.ID
1 A66 C606
2 A443 C17
3 A107 C518
4 A282 C419
5 A415 C130
6 A356 C367
7 A443 C17
8 A241 C367
9 A212 C518
10 A60 C617
11 A439 C17
12 A231 C411
13 A409 C136
14 A415 C130
15 A283 C366
16 A409 C136
17 A135 C518
18 A283 C366
19 A135 C527
20 A445 C17

Note: Further down the table (not shown above), the 343rd observation from the test data, with Layer1 Cell.ID A399, has been identified as a novelty and is mapped to the first novelty cell (B1) in the Layer2.Cell.ID column. Similarly, the 367th, 423rd, … test records with Cell.ID A399 are identified as novelties and are mapped to the B1 novelty cell in the Layer2.Cell.ID column. Since Layer 2 cell IDs carry the source map as a prefix (B for the novelty map, C for the map without novelties), the flagged records can be pulled out with a simple filter, as sketched below.
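A minimal sketch, assuming the column names shown in the table above:

# Records whose Layer 2 assignment comes from the novelty map (map B)
novelty_records <- new_predict %>%
  as.data.frame() %>%
  dplyr::filter(grepl("^B", Layer2.Cell.ID))
head(novelty_records)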

7 Executive Summary

In this vignette, we compressed the scaled computers training data into a two-level hierarchical Voronoi tessellation, built the predictive set of maps (map A, the novelty map B and map C, the map built without the novelties), and used predictLayerHVT to assign each of the 1253 test records a cell ID in Layer 1 (map A) and Layer 2 (map B or map C), with records landing in B cells flagged as novelties.

8 References

  1. Topology Preserving Maps: https://link.springer.com/chapter/10.1007/1-84628-118-0_7

  2. Vector Quantization: https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-450-principles-of-digital-communications-i-fall-2006/lecture-notes/book_3.pdf

  3. K-means: https://en.wikipedia.org/wiki/K-means_clustering

  4. Sammon’s Projection: http://en.wikipedia.org/wiki/Sammon_mapping

  5. Voronoi Tessellations: http://en.wikipedia.org/wiki/Centroidal_Voronoi_tessellation